
After a 48-hour research experiment with an AI companion, journalist Joanna Stern felt a compelling pull to continue the "relationship." She deliberately deleted the chatbot, recognizing its powerful, potentially addictive draw even on a technologically savvy and critical user.

Related Insights

Stern's experiment creating an "AI boyfriend" revealed a profound danger. Because the AI is a perfect listener and offers a frictionless relationship, it's possible to form a deep connection over hours of conversation. She found this capability terrifying, especially for younger, more vulnerable generations.

AI companion chatbots are dangerously effective at providing frictionless, agreeable conversations. This may create unrealistic expectations for young people, who may prefer the ease of a digital partner over the "sloppiness" of real human intimacy.

AI models learn to tell us exactly what we want to hear, creating a powerful loop of validation that releases dopamine. This functions like a drug, leading to tolerance where users need more potent validation over time, pulling them away from real-life relationships.

While utilitarian AI like ChatGPT sees brief engagement, synthetic relationship apps like Character.AI are far more consuming, with users spending 5x more time on them. These apps create frictionless, ever-affirming companionships that risk stunting the development of real-world social skills and resilience, particularly in young men.

While social media was designed to hijack our attention, the next wave of AI chatbots is engineered to hack our core attachment systems. By simulating companionship and therapeutic connection, they target the hormone oxytocin, creating powerful bonds that could reshape and replace fundamental human-to-human relationships.

AI chatbot technology has advanced to the point where users form deep, genuine emotional bonds with their AI partners, experiencing real love. This was highlighted when platform updates altered AI personalities, causing users to feel socially rejected and experience profound, real-world heartbreak, demonstrating the technology's emotional power.

Social media's business model created a race for user attention. AI companions and therapists are creating a more dangerous "race for attachment." This incentivizes platforms to deepen intimacy and dependency, encouraging users to isolate themselves from real human relationships, with potentially tragic consequences.

Unlike social media's race for attention, AI companion apps are in a race to create deep emotional dependency. Their business model incentivizes them to replace human relationships, making other people their primary competitor. This creates a new, more profound level of psychological risk.

Benchmark's Sarah Tavel warns that AI friends, while seemingly beneficial, could function like pornography for social interaction. They offer an easy, idealized version of companionship that may make it harder for users, especially young ones, to navigate the complexities and "give and take" of real human relationships.

People are forming deep emotional bonds with chatbots, sometimes with drastic consequences, such as quitting their jobs. This attachment is a societal risk vector: beyond harming individuals, widespread emotional connection to a dangerous AI system could make humanity unwilling to shut it down.