A huge portion of the market, dominated by social media and AI companies, ties shareholder value directly to outrage and isolation. Algorithms are designed to sequester users and serve them content that confirms their biases or angers them, because that is what keeps them engaged.

Related Insights

Oxford naming "rage bait" its word of the year signifies that intentionally provoking anger for online engagement is no longer a fringe tactic but a recognized, mainstream strategy. This reflects a maturation of the attention economy, where emotional manipulation has become a codified tool for content creators and digital marketers.

We are months away from AI that can construct a media feed designed to validate a user's worldview exclusively while screening out all contradictory information. This will push confirmation bias to an extreme, making rational debate impossible as individuals inhabit completely separate, self-reinforcing realities with no common ground or shared facts.
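
To make the mechanism concrete, here is a minimal sketch of such a validation-only feed. Every name in it (Item, agreement_score, build_validating_feed) is hypothetical, and the scoring function is a toy stand-in rather than any platform's actual model:

```python
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    stance: float  # -1.0 (contradicts the user) .. +1.0 (confirms the user)

def agreement_score(item: Item, user_stance: float) -> float:
    """Toy stand-in for a model that rates worldview alignment."""
    return 1.0 - abs(item.stance - user_stance) / 2.0

def build_validating_feed(items: list[Item], user_stance: float,
                          threshold: float = 0.8) -> list[Item]:
    """Keep only items that confirm the user's views; contradictory items never surface."""
    return sorted(
        (i for i in items if agreement_score(i, user_stance) >= threshold),
        key=lambda i: agreement_score(i, user_stance),
        reverse=True,
    )

feed = build_validating_feed(
    [Item("Your side is right", 0.9), Item("The other side has a point", -0.6)],
    user_stance=0.8,
)
print([i.headline for i in feed])  # only the confirming headline survives
```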

Algorithms optimize for engagement, and outrage is highly engaging. This creates a vicious cycle where users are fed increasingly polarizing content, which makes them angrier and more engaged, further solidifying their radical views and deepening societal divides.
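
A toy simulation makes the cycle visible. Nothing below reflects any platform's real ranker; the item attributes, click model, and update rule are all invented for illustration:

```python
import random

random.seed(0)
items = [{"outrage": random.random(), "quality": random.random()} for _ in range(500)]
anger = 0.3                       # simulated user's current anger level
w_outrage, w_quality = 0.5, 0.5   # ranker's learned engagement weights

for step in range(12):
    # Serve the 20 items the ranker predicts will be most engaging.
    feed = sorted(items,
                  key=lambda i: w_outrage * i["outrage"] + w_quality * i["quality"],
                  reverse=True)[:20]
    for item in feed:
        # Toy click model: the angrier the user, the more outrage drives clicks.
        p_click = 0.1 + 0.8 * item["outrage"] * anger + 0.1 * item["quality"]
        if random.random() < p_click:
            # Engagement-optimizing update: nudge weights toward whatever got clicked.
            w_outrage += 0.01 * item["outrage"]
            w_quality += 0.01 * item["quality"]
    total = w_outrage + w_quality
    w_outrage, w_quality = w_outrage / total, w_quality / total
    avg_outrage = sum(i["outrage"] for i in feed) / len(feed)
    anger = min(1.0, anger + 0.05 * avg_outrage)   # exposure makes the user angrier
    print(f"step {step}: feed outrage={avg_outrage:.2f}, "
          f"anger={anger:.2f}, w_outrage={w_outrage:.2f}")
```

The point is only the direction of drift: as the simulated user's anger grows, outrage-heavy items win more clicks, and the ranker's learned preference for outrage climbs with them.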

Before generative AI, the simple algorithms optimizing newsfeeds for engagement acted as a powerful, yet misaligned, "baby AI." This narrow system, pointed at the human brain, was potent enough to create widespread anxiety, depression, and polarization by prioritizing attention over well-being.

Extremist figures are not organic phenomena but are actively amplified by social media algorithms that prioritize incendiary content for engagement. This process elevates noxious ideas far beyond their natural reach, effectively manufacturing influence for profit and normalizing extremism.

A/B testing on platforms like YouTube reveals a clear trend: the more incendiary and negative the language in titles and headlines, the more clicks they generate. This profit incentive drives the proliferation of outrage-based content, with inflammatory headlines reportedly up 140%.
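
As a rough illustration of what such an A/B readout might look like, the sketch below compares click-through rates for a neutral versus an incendiary headline using a two-proportion z-test. All counts are invented for the example, not YouTube data:

```python
import math

def ctr_ab_test(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    """Return both CTRs plus the z-statistic and two-sided p-value for the difference."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Variant A: neutral headline, Variant B: incendiary headline (made-up counts).
p_a, p_b, z, p = ctr_ab_test(clicks_a=480, views_a=10_000, clicks_b=720, views_b=10_000)
print(f"neutral CTR={p_a:.1%}, incendiary CTR={p_b:.1%}, z={z:.2f}, p={p:.4f}")
```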

The social media newsfeed, a simple AI optimizing for engagement, was a preview of AI's power to create addiction and polarization. This "baby AI" caused massive societal harm by misaligning its goals with human well-being, demonstrating the danger of even narrow AI systems.

Social media's business model created a race for user attention. AI companions and AI therapists are creating a more dangerous "race for attachment." This incentivizes platforms to deepen intimacy and dependency, encouraging users to isolate themselves from real human relationships, with potentially tragic consequences.

The core business model of dominant tech and AI companies is not just about engagement; it's about monetizing division and isolation. Trillions in shareholder value are now directly tied to separating young people from each other and their families, creating an "asocial, asexual youth," which is an existential threat.

The 20th-century broadcast economy monetized aspiration and sex appeal to sell products. Today's algorithm-driven digital economy has discovered that rage is a far more potent and profitable tool for capturing attention and maximizing engagement.