Executive Summary

Complex Information and Fake News

Why protecting truth requires an expanded concept of information


The current challenge

The spread of fake news has become one of the central threats to democratic societies and public trust.

The dominant response has been technical and regulatory:

→ more fact-checking → more verification → more automated moderation

While necessary, these measures address only part of the problem.


The underlying problem

Classical information systems operate with a reduced concept of truth: only what can be verified as a fact by a machine counts as true.

Human communication, however, does not function on facts alone. Meaning, emotion, context, and experience play a decisive role.

When systems recognize only machine-verifiable truth, two effects occur: meaning and emotion lose any legitimate place in the system, and they re-enter communication disguised as facts.

This environment is fertile ground for fake news.


What fake news actually is

Fake news is not primarily a matter of false facts.

It consists of statements in which emotional meaning is presented as objective truth, while the factual component is hidden, distorted, or absent.

The core issue is opacity, not emotionality.


Complex information as a structural solution

Complex information distinguishes between two components of every statement: a factual, verifiable component (the real part) and a component of meaning and emotion (the imaginary part).

Both components are legitimate, but they must be explicitly separated.

This separation allows facts to be checked as facts and meaning to be expressed as meaning, without one being passed off as the other.
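
To make the structure concrete, here is a minimal, purely illustrative sketch in Python. The class ComplexStatement, its field names, and the sample sentences are hypothetical and not part of any existing system; the sketch only shows what it could mean to carry a factual (real) component and a meaning (imaginary) component side by side and to label a statement by which components it actually contains.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ComplexStatement:
    """Hypothetical model of a statement with two explicit components."""
    real: Optional[str] = None        # factual component, open to verification
    imaginary: Optional[str] = None   # meaning, emotion, interpretation
    verified: bool = False            # whether the real component passed fact-checking

    def label(self) -> str:
        """Describe the statement by the components it actually contains."""
        if self.real and self.imaginary:
            status = "verified fact" if self.verified else "unverified claim"
            return status + " + interpretation"
        if self.real:
            return "verified fact" if self.verified else "unverified claim"
        if self.imaginary:
            return "interpretation only (no factual component)"
        return "empty statement"


# Invented sample statements, for illustration only.
if __name__ == "__main__":
    s1 = ComplexStatement(
        real="The unemployment rate fell last quarter.",
        imaginary="This is a relief for many families.",
        verified=True,
    )
    s2 = ComplexStatement(imaginary="The elites are lying to you.")
    print(s1.label())  # verified fact + interpretation
    print(s2.label())  # interpretation only (no factual component)

In such a model, fact-checking would only ever apply to the real component, while the imaginary component is displayed as what it is and never marked true or false; this is one possible reading of the separation described above, not a prescribed implementation.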


Why this protects truth

By making the imaginary component visible rather than suppressing it, emotional and interpretive claims can no longer be presented as objective facts, and the factual component remains open to verification.

Truth is not relativized. It is structurally reinforced.


Strategic implications

Information systems that ignore the imaginary component risk leaving meaning and emotion without a formal place, eroding public trust, and providing fertile ground for fake news.

Systems based on complex information enable the explicit separation of fact and meaning, transparent handling of both components, and a structural reinforcement of truth.


Conclusion

Fake news does not arise because humans are emotional. It arises because information systems lack a formal place for emotion and meaning.

Protecting truth requires more than verification. It requires an expanded concept of information.


Your Support

Stickers are available as a visible sign of support for broadening the concept of information and can be affixed to IT equipment.