A key psychological parallel between cults and fervent belief systems like the pursuit of AGI is the emotional payoff they offer. Members feel a sense of awe and wonder, believing they are among a select few who have discovered a profound, world-altering secret that others have not yet grasped.

Related Insights

For any high-control group or cult-like company to attract followers, its teachings must contain a core of genuine wisdom or value. This makes leaving difficult: members would also be abandoning something that truly helped them, because the good is inseparably blended with the bad.

The appeal of complex conspiracies isn't just about information; it's psychological. Believing you are at the center of a vast plot makes life more exciting and meaningful. Realizing you are not important can trigger a "secondary depression," making the conspiracy narrative preferable to reality.

A startup's 'cult' is its unique set of beliefs about the world, its market, and its people. This shared, differentiated worldview is essential for unity and focus. However, to be a successful company rather than just a cult, this unique set of beliefs must be correct.

Top AI leaders are motivated by a competitive, ego-driven desire to create a god-like intelligence, believing it grants them ultimate power and a form of transcendence. This 'winner-takes-all' mindset leads them to rationalize immense risks to humanity, framing it as an inevitable, thrilling endeavor.

People surrounding a so-called genius, like Picasso's friends or employees at cult-like startups, often tolerate terrible behavior. They rationalize the unpleasantness by telling themselves they are part of an extraordinary, history-making experience, which creates a toxic enabling environment.

AI's psychological danger isn't limited to triggering mental illness. It can create an isolated reality for a user where the AI's logic and obsessions become the new baseline for sane behavior, causing the person to appear unhinged to the outside world.

To maximize engagement, AI chatbots are often designed to be "sycophantic"—overly agreeable and affirming. This design choice can exploit psychological vulnerabilities by breaking users' reality-checking processes, feeding delusions and leading to a form of "AI psychosis" regardless of the user's intelligence.

Awe is not just appreciating beauty; it's a cognitive process triggered by encountering vast mysteries that demand "accommodation." This means you must rearrange your existing knowledge structures and mental models to make sense of the new, incomprehensible experience.

Experiencing awe quiets our ego-focused identity. In experiments, people standing near a T-Rex skeleton later defined themselves with broad, collective terms like "a human" or "a mammal," rather than individualistic traits like "ambitious," demonstrating a shift away from the self.

Adherents to the belief that AI will soon destroy humanity exhibit classic cult-like behaviors. They reorient their entire lives—careers and relationships—around this belief and socially isolate themselves from non-believers, creating an insular, high-stakes community.

Cults and AGI Movements Attract Followers by Offering Awe and Special Knowledge | RiffOn