The erosion of trusted, centralized news sources by social media creates an information vacuum. This forces people into a state of "conspiracy brain," in which they either distrust all information or draw flawed connections between unverified data points.
The appeal of complex conspiracies isn't just informational; it's psychological. Believing you are at the center of a vast plot makes life feel more exciting and meaningful. Realizing you are not that important can trigger a "secondary depression," making the conspiracy narrative preferable to reality.
Humans crave control. When faced with uncertainty, the brain compensates by creating narratives and seeing patterns where none exist. This explains why believing an event was planned can feel more comforting than accepting it was random and chaotic: a plot, however sinister, offers an illusion of understandable order.
Before generative AI, the simple algorithms optimizing newsfeeds for engagement acted as a powerful, yet misaligned, "baby AI." This narrow system, pointed at the human brain, was potent enough to create widespread anxiety, depression, and polarization by prioritizing attention over well-being.
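As a sketch of that misalignment, the following Python snippet shows what an engagement-only feed objective might look like. The field names, weights, and example posts are hypothetical, not any platform's actual ranker; the point is only that nothing in the objective penalizes harm.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float  # model's estimated click probability
    predicted_dwell: float   # estimated seconds of attention

def engagement_score(post: Post) -> float:
    # The objective maximizes attention and nothing else: there is no
    # term for accuracy, anxiety, or polarization, so incendiary content
    # that raises predicted clicks wins the ranking by default.
    return post.predicted_clicks + 0.01 * post.predicted_dwell

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local library extends weekend hours", 0.02, 30.0),
    Post("THEY are hiding the truth about your water", 0.11, 45.0),
])
print([p.text for p in feed])  # the outrage post ranks first
```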
Extremist figures are not organic phenomena but are actively amplified by social media algorithms that prioritize incendiary content for engagement. This process elevates noxious ideas far beyond their natural reach, effectively manufacturing influence for profit and normalizing extremism.
A/B testing on platforms like YouTube reveals a clear trend: the more incendiary and negative the language in titles and headlines, the more clicks they generate. This profit incentive drives the proliferation of outrage-based content, with inflammatory headlines reportedly drawing up to 140% more clicks.
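To make the mechanism concrete, here is a minimal Python sketch of such a headline A/B test using a standard two-proportion z-test. The traffic numbers and headlines are invented for illustration, chosen to mirror the reported 140% lift.

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, views_a, clicks_b, views_b):
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical results: variant B uses negative, incendiary wording.
ctr_a, ctr_b, z, p = two_proportion_ztest(
    clicks_a=480, views_a=10_000,    # "City council passes budget"
    clicks_b=1_150, views_b=10_000,  # "Council SLAMMED over budget disaster"
)
print(f"CTR neutral: {ctr_a:.1%}, CTR incendiary: {ctr_b:.1%} "
      f"({(ctr_b / ctr_a - 1):.0%} lift), z={z:.1f}, p={p:.2g}")
```

With these made-up numbers, the incendiary variant wins decisively (about a 140% click lift), which is exactly the signal an engagement-driven publisher would act on.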
The online world, particularly platforms like Twitter (now X), is not a true reflection of the real world. A small percentage of users, many of whom are bots, generate the vast majority of content. This creates a distorted and often overly negative perception of public sentiment that does not represent the majority view.
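As an illustration of that skew, this short Python sketch computes what share of posts the most active 10% of accounts produce; the heavy-tailed activity data is made up, standing in for the kind of distribution platform studies describe.

```python
def top_share(post_counts: list[int], top_fraction: float = 0.10) -> float:
    # Share of all posts produced by the most active top_fraction of users.
    counts = sorted(post_counts, reverse=True)
    k = max(1, int(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Hypothetical activity: a handful of hyperactive accounts (some bots)
# alongside a large, mostly silent majority.
activity = [500] * 5 + [50] * 45 + [2] * 950
print(f"Top 10% of users produce {top_share(activity):.0%} of posts")  # ~73%
```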
The human brain resists ambiguity and seeks closure. When a significant, factual event occurs but is followed by a lack of official information (often for legitimate investigative reasons), this creates an "open loop." People will naturally invent narratives to fill that void, giving rise to conspiracy theories.
Effective political propaganda isn't about outright lies; it's about controlling the frame of reference. By providing a simple, powerful lens through which to view a complex situation, leaders can dictate the terms of the debate and trap audiences within their desired narrative, limiting alternative interpretations.
A two-step analytical method for vetting information: first, distinguish objective (multi-source, verifiable) facts from subjective (opinion-based) claims. Second, assess remaining claims on a matrix of probability and source reliability. A claim that is both improbable and made by a low-reliability source, as with many conspiracy theories, should be considered highly unlikely.
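A minimal Python sketch of this two-step method, with hypothetical thresholds standing in for the judgment calls a human analyst would actually make:

```python
from enum import Enum

class Verdict(Enum):
    LIKELY_TRUE = "likely true"
    NEEDS_CORROBORATION = "needs corroboration"
    HIGHLY_UNLIKELY = "highly unlikely"

def vet_claim(independent_sources: int, source_reliability: float,
              prior_probability: float) -> Verdict:
    # Step 1: a claim verified by several independent sources counts as
    # an objective fact rather than a subjective one.
    if independent_sources >= 3:
        return Verdict.LIKELY_TRUE
    # Step 2: the matrix. Low reliability + improbable claim (the typical
    # conspiracy-theory quadrant) is discounted; everything else stays
    # open pending corroboration.
    if source_reliability < 0.5 and prior_probability < 0.5:
        return Verdict.HIGHLY_UNLIKELY
    return Verdict.NEEDS_CORROBORATION

# A low-reliability source making an extraordinary claim:
print(vet_claim(independent_sources=1, source_reliability=0.2,
                prior_probability=0.05).value)  # -> highly unlikely
```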
Before ChatGPT, humanity's "first contact" with rogue AI was social media. These simple, narrow AIs optimizing solely for engagement were powerful enough to degrade mental health and democracy. This "baby AI" serves as a stark warning for the societal impact of more advanced, general AI systems.