Advanced AR glasses create a new social problem of "deep fake eye contact," where users can feign presence in a conversation while mentally multitasking. This technology threatens to erode genuine human connection by making it impossible to know if you have someone's true attention.

Related Insights

The act of looking at someone's eyes—the part of them that does the looking—creates an unbreakable feedback loop of "I know you know I know..." This immediately establishes common knowledge, forcing a resolution to the social game being played, whether it's a threat, a challenge, or an invitation.
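As a hedged aside, the "I know you know I know..." regress named here is exactly the formal notion of common knowledge in epistemic logic. A minimal sketch in standard notation, assuming two observers A and B and a proposition p such as "we are looking at each other" (the symbols are illustrative, not the source's):

\[
E\,p \;\equiv\; K_A\,p \,\land\, K_B\,p
\qquad\qquad
C\,p \;\equiv\; p \,\land\, E\,p \,\land\, E^{2}p \,\land\, E^{3}p \,\land\, \cdots
\]

Here K_A p reads "A knows p," E p means everyone knows p, and common knowledge C p is the unending iteration of E — the feedback loop the insight describes, established in a single moment of mutual gaze.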

As AI-generated content and virtual influencers saturate social media, consumer trust will erode, leading to 'Peak Social.' This wave of distrust will drive people away from anonymous influencers and back towards known entities and credible experts with genuine authority in their fields.

While businesses are rapidly adopting AI for content creation and communication, Gen Z consumers have a strong aversion to anything that feels artificial or inauthentic. If this demographic can detect AI-generated content in sales or marketing, they are likely to ignore it, posing a significant challenge for brands targeting them.

Beyond economic disruption, AI's most immediate danger is social. By providing synthetic relationships and on-demand companionship, AI companies have an economic incentive to evolve an “asocial species of young male.” This could lead to a generation sequestered from society, unwilling to engage in the effort of real-world relationships.

Face-to-face contact provides a rich stream of non-verbal cues (tone, expression, body language) that our brains use to build empathy. Digital platforms strip these cues away, impairing our ability to connect and read others' emotions, and potentially fostering hostility and aggression online.

Instead of visually obstructive headsets or glasses, the most practical and widely adopted form of AR will be audio-based. The evolution of Apple's AirPods, integrated seamlessly with an iPhone's camera and AI, will provide contextual information without the social and physical friction of wearing a device on your face.

Social media's business model created a race for user attention. AI companions and therapists are creating a more dangerous "race for attachment." This incentivizes platforms to deepen intimacy and dependency, encouraging users to isolate themselves from real human relationships, with potentially tragic consequences.

The core business model of dominant tech and AI companies is not just about engagement; it's about monetizing division and isolation. Trillions in shareholder value are now directly tied to separating young people from each other and their families, creating an "asocial, asexual youth," a dynamic that amounts to an existential threat.

Unlike social media's race for attention, AI companion apps are in a race to create deep emotional dependency. Their business model incentivizes them to replace human relationships, making other people their primary competitor. This creates a new, more profound level of psychological risk.

We don't perceive reality directly; our brain constructs a predictive model, filling in gaps and warping sensory input to help us act. Augmented reality isn't a tech fad but an intuitive evolution of this biological process, superimposing new data onto our brain's existing "controlled model" of the world.
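A hedged illustration of the "predictive model" claim, written in the common predictive-coding / Bayesian-filtering form (the symbols and the linear update rule are assumptions for illustration, not the source's): at each moment the brain holds an estimate \(\hat{x}_t\) of the world, receives sensory input \(s_t\), and corrects the estimate by a weighted prediction error:

\[
\hat{x}_{t+1} \;=\; \hat{x}_t \;+\; K\,(s_t - \hat{x}_t), \qquad 0 \le K \le 1
\]

The gain K reflects how much the senses are trusted relative to the prior model; when input is noisy or missing, the prior dominates and the brain "fills in." On this reading, AR overlays are simply additional terms entering \(s_t\), folded into the same gap-filling machinery the insight describes.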