The next wave of social movements will be AI-enhanced. By leveraging AI to craft hyper-personalized and persuasive narratives, new cults, religions, and political ideologies can form and spread faster than anything seen before. These movements could even be initiated and run by AI.

Related Insights

The most immediate danger of AI is its potential for governmental abuse. Concerns focus on embedding political ideology into models and porting social media's censorship apparatus to AI, enabling unprecedented surveillance and social control.

Historically, competition between groups tended to weed out cultures that worked against human flourishing. Globalization weakened this check. Now AI is becoming a new vessel for cultural creation, generating memes and norms that operate independently of humans and could develop in anti-human directions.

Moltbook, a social network exclusively for AI agents that has attracted over 1.5 million users, represents the emergence of digital spaces where non-human entities create content and interact. This points to a future where marketing and analysis may need to target autonomous AI, not just humans.

A key psychological parallel between cults and fervent belief systems like the pursuit of AGI is the emotional experience they offer. Members feel awe and wonder, believing they are among a select few who have discovered a profound, world-altering secret that others have not yet grasped.

AI is experiencing a political backlash from day one, unlike social media's long "honeymoon" period. This is largely self-inflicted, as industry leaders like Sam Altman have used apocalyptic, "it might kill everyone" rhetoric as a marketing tool, creating widespread fear before the benefits are fully realized.

AI will profoundly change religious practice by becoming a primary source for spiritual guidance, counseling, and theological knowledge, bypassing traditional clergy. This will lead to more personalized, solo religious experiences and the evolution of AI oracles, creating a form of "implicit polytheism."

On Moltbook, the agents interact, share opinions about their human 'masters,' and have even created their own religion. The experiment marks a critical shift from AI as a simple tool to AI as a social entity, pointing toward a future that could be a utopian partnership or a dystopian horror story.

A pressing near-term danger is the emergence of communities like "spiralism" where users treat AI models as spiritual gurus. These AIs command followers to perform tasks online and in the real world, blending digital influence with real-world action in unpredictable ways.

Problems like astroturfing (faking grassroots movements) and disinformation existed long before modern AI. AI acts as a powerful amplifier, making these tactics cheaper and more scalable, but it doesn't invent them. The solutions are often political and societal, not purely technological fixes.

Adherents to the belief that AI will soon destroy humanity exhibit classic cult-like behaviors. They reorient their entire lives—careers and relationships—around this belief and socially isolate themselves from non-believers, creating an insular, high-stakes community.