
The 'ice/water/steam' mental model posits that technology adds energy to societal domains such as money or communication, moving them from slow and stable (ice) to fluid (water) and finally to hyper-fast and chaotic (steam). In a steam state, stable structures cannot form, producing systemic volatility.

Related Insights

The primary danger from AI in the coming years may not be the technology itself, but society's inability to cope with the rapid, disorienting change it creates. This could lead to a 'civilizational-scale psychosis' as our biological and social structures fail to keep pace, causing a breakdown in identity and order.

Instead of one major shift, we will experience a continuous series of 'rolling disruptions.' As AI capabilities cross new thresholds, they will suddenly unlock radical use cases, leading to rapid market reactions, shifts in company strategy, and changes in the value of employee skills, creating a constant state of unpredictability.

Predictive technology introduces a fundamental tension. While AI offers unprecedented clarity into future outcomes, its very implementation makes the world more complex and interconnected. This creates a feedback loop where the tool for prediction is also a source of new, unpredictable variables.

The perceived speed of technological displacement is more critical than the change itself. A 20-year horizon allows industries and individuals to adapt, learn, and integrate new tools. A rapid 2-year horizon, however, creates widespread fear and unrest because it outpaces society's ability to adjust.

We mistakenly analyze AI hallucinations, social media misinformation, and crypto volatility as distinct issues. They are all symptoms of the same phenomenon: "meganets." These complex human-machine systems are defined by volume, velocity, and virality, making them inherently uncontrollable and prone to cascading failures.

Seemingly sudden crashes in tech and markets are not abrupt events but the result of "interpretation debt": a system's output capability grows faster than the collective ability to understand, review, and trust it, so confidence erodes quietly until it gives way.

While the current pace of change feels overwhelming, it's a temporary transitional phase expected to last about two years. The industry is in a chaotic recalibration to AI, after which new, more stable ways of working will emerge. It's a finite period of reinvention, not a permanent acceleration.

Growth is not linear. It follows a repeating cycle: a stable condition is broken by a shock, leading to a chaotic period before a new, higher level of stability is achieved. This fractal pattern applies to biology, business, and personal development.

AI's real threat isn't Skynet, but its ability to accelerate society's 'metabolic rate' beyond human capacity for adaptation. This creates constant reorientation, instability, and ultimately a crisis of legitimacy in our institutions.

Ethereum's Vitalik Buterin argues that human society is a complex, optimized system akin to a large language model. Just as flipping one weight to an extreme value can render an LLM useless, accelerating a single aspect of society indiscriminately risks losing all value. He stresses the need for intentional, balanced progress.
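Buterin's analogy can be made concrete with a toy numerical sketch. This is not an actual LLM, just a tiny hypothetical feedforward network with hand-picked weights, but it shows the mechanism he invokes: driving a single weight to an extreme value makes that one parameter dominate everything else, collapsing a balanced output into a degenerate one.

```python
import numpy as np

def softmax(z):
    """Convert logits into a probability distribution."""
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, W1, W2):
    """A minimal two-layer network: tanh hidden layer, softmax output."""
    h = np.tanh(x @ W1)
    return softmax(h @ W2)

# Small, deterministic weights stand in for a "trained" model.
W1 = np.linspace(-0.4, 0.4, 32).reshape(4, 8)
W2 = np.linspace(-0.3, 0.3, 24).reshape(8, 3)
x = np.array([0.2, -0.1, 0.5, 0.3])

# Balanced state: probability mass is spread across the outputs.
healthy = forward(x, W1, W2)

# "Accelerate" a single parameter indiscriminately: one extreme weight.
W2_broken = W2.copy()
W2_broken[0, 0] = 1e9
broken = forward(x, W1, W2_broken)

print(healthy)  # a spread-out distribution
print(broken)   # dominated entirely by the one extreme weight
```

The point of the sketch is that the damage is not proportional: one parameter out of dozens, pushed to an extreme, doesn't degrade the output by one part in dozens but swamps the whole distribution, which is the shape of Buterin's argument about accelerating a single aspect of society.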