
Killing via a screen, whether in drone warfare or in uncensored social media videos, removes the psychological burden associated with taking a life. This desensitization dangerously lowers the barrier to violence and erodes the profound weight that should accompany such an act.

Related Insights

The same cognitive switch that lets us see humanity in animals can be inverted to ignore it in people. This 'evil twin,' dehumanization, makes it psychologically easier to harm others during conflict. Marketers and propagandists exploit both sides of this coin, using cute animals to build affinity and dehumanization to justify aggression.

Decades of technological dominance, particularly in battlefield medicine that ensures a 'golden hour' for wounded soldiers, have fundamentally lowered America's societal risk tolerance for casualties. This creates a strategic vulnerability against adversaries willing to accept massive losses, raising the question of whether the US has the stomach for a high-intensity conflict in which such advantages are nullified.

In the current media landscape, the political impact of military casualties depends on their virality. A non-visual event described in a traditional news article lacks the resonance of a graphic video shared on platforms like TikTok. This creates a grim calculus where policy is only influenced by losses that are visually shocking and widely shared.

While online discourse feels intensely hostile, it may serve as a substitute for physical conflict. The ability to engage in "virtual combat" provides an outlet for tribal anger that, in previous media eras, often manifested as street violence. Notably, measured political violence is currently at an all-time low.

The war in Ukraine marks a historical inflection point in military technology. For the first time since the 19th century, the primary method of killing a soldier is no longer a bullet or artillery shell, but a drone. This fundamentally changes battlefield tactics and defense strategies.

Face-to-face contact provides a rich stream of non-verbal cues (tone, expression, body language) that our brains use to build empathy. Digital platforms strip these cues away, impairing our ability to connect with others and understand their emotions, and potentially fostering undue hostility and aggression online.

Jeffrey Goldberg critiques the casual, emoji-laden discourse of officials discussing military action. He argues that even when targeting terrorists, leaders must not "act like a fucking child" because killing people is not a video game. Such solemnity about lethal force, he notes, is an increasingly lonely position.

Time is a key component of our "psychological immune system," naturally reducing the intensity of negative emotions. Social media bypasses this by allowing instant sharing at peak emotional intensity, leading to unfiltered communication that lacks the moderating effect of real-world interaction delays.

Beyond the risk of tactical mistakes, a critical ethical concern with AI in warfare is the psychological distancing of soldiers from the act of killing. If no one feels morally responsible for the violence occurring, it could lead to less restraint, more suffering, and an increased willingness to engage in conflict.

Global conflicts are increasingly processed through an emotional lens, amplified by social media. Because algorithms reward outrage over analysis, public discourse becomes deranged, making populations more likely to support violent escalation without understanding the consequences of their leaders' actions.