
Features like Codex's '/goal' create a new paradigm of persistent, autonomous agents that can work on a task for days. This shift from active human prompting to unattended 24/7 AI work is expected to cause an exponential increase in token consumption and compute demand, reinforcing the infrastructure boom.

Related Insights

Unlike human-driven growth, which is limited by population and waking hours, AI agents can operate, replicate, and call each other endlessly. This creates a potentially unbounded demand for compute infrastructure, far exceeding earlier growth models and placing massive, unpredictable strains on providers.

Contrary to the view that AI token intensity will drop after the initial coding boom, the move from simple queries to autonomous 'agentic' workflows will cause an order-of-magnitude (10x) increase in token usage per task. This applies across all knowledge-based jobs, ensuring sustained and explosive demand for compute.

The shift from simple chatbots (one user request, one API call) to agentic AI systems will decouple inference requests from direct user actions. A single user request could trigger hundreds or thousands of automated model calls, leading to an exponential increase in compute demand and cost.
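The fan-out described above can be made concrete with a toy sketch. This is a hypothetical illustration, not any real agent framework's API: `call_model` is a stand-in stub that merely counts invocations, and the step and subtask counts are arbitrary assumptions.

```python
# Toy illustration (hypothetical stub, not a real API): contrast a chatbot's
# one-call-per-request pattern with an agentic loop that fans out into many
# automated model calls for a single user request.

call_count = 0

def call_model(prompt: str) -> str:
    """Stand-in for an LLM API call; it just counts invocations."""
    global call_count
    call_count += 1
    return f"response to: {prompt}"

def chatbot(request: str) -> str:
    # One user request -> exactly one model call.
    return call_model(request)

def agent(request: str, steps: int = 3, subtasks_per_step: int = 4) -> list[str]:
    # One user request -> a planning call, then many automated sub-calls.
    plan = call_model(f"plan: {request}")
    results = []
    for step in range(steps):
        for sub in range(subtasks_per_step):
            results.append(call_model(f"{plan} / step {step} / subtask {sub}"))
    return results

chatbot("summarize this doc")
after_chatbot = call_count           # 1 call so far

agent("summarize this doc")
print(call_count - after_chatbot)    # 13 calls for one request (1 plan + 3 * 4 subtasks)
```

Even with these small, made-up parameters, the agent issues 13 model calls where the chatbot issued one; real agentic workflows with deeper loops push that multiplier into the hundreds or thousands the blurb describes.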

Ben Thompson argues the shift from simple chatbots to AI agents creates an exponential, non-speculative demand for compute. Agents automate complex, multi-step tasks, driving constant usage that justifies the massive capex investments by hyperscalers. This suggests the current spending is based on real demand, not bubble-fueled speculation.

The current AI data center arms race isn't about meeting today's demand for chatbots. It's fueled by companies like Meta betting on a future where personal AI agents run constantly, analyzing every interaction. This vision of persistent, parallel agents requires an exponential increase in compute, explaining why they will buy any available capacity.

The shift from simple query-based AI to agentic AI, where AI calls itself recursively to solve complex tasks, increases compute demand by orders of magnitude. Most people, especially non-coders, fail to grasp this exponential shift, leading them to consistently underestimate the scale and duration of the AI infrastructure build-out.
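The "orders of magnitude" claim follows from simple geometry of the call tree. A minimal sketch, under the assumption that each agent call delegates to a fixed number of sub-agents (the `depth` and `branch` values here are illustrative, not measured):

```python
# Toy sketch (hypothetical, no real API): a recursive agent in which each
# model call spawns `branch` sub-calls down to a given `depth`. Total calls
# grow geometrically: (branch**(depth + 1) - 1) // (branch - 1).

def recursive_agent_calls(depth: int, branch: int) -> int:
    """Count model calls in a call tree of the given depth and branching factor."""
    if depth == 0:
        return 1                      # leaf: one model call, no delegation
    # one call at this level, plus `branch` recursive sub-agents below it
    return 1 + branch * recursive_agent_calls(depth - 1, branch)

print(recursive_agent_calls(0, 4))    # 1  -- a plain chatbot query
print(recursive_agent_calls(3, 4))    # 85 -- three levels of delegation
```

Going from zero levels of delegation to three turns one call into 85, which is why modest-looking increases in agent autonomy translate into the non-linear compute growth the blurb argues most observers underestimate.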

The next wave of AI adoption involves 'agentic' workflows, where AI performs complex tasks autonomously. This shift from simple queries to agentic use is expected to increase token consumption by approximately 10x per task. This will drive a massive explosion in compute demand across all knowledge-work industries, not just coding.

While user growth for apps like ChatGPT is slowing, per-user token consumption is skyrocketing as models shift from simple queries to complex reasoning and AI agents. This creates a hidden, exponential growth in compute demand, validating Oracle's massive infrastructure investment even as front-end adoption matures.

The largest driver of future energy consumption for AI won't be human-initiated queries on chatbots. Instead, it will be the massive, continuous "machine-to-machine" traffic generated by autonomous AI agents performing tasks, which will ultimately swamp human-AI interaction and create a runaway demand for compute power.

The transition from chatbots to autonomous 'agentic' AI represents a fundamental step-change. These agents, which execute complex tasks independently, have already increased the demand for computational power by 1000x, creating a massive, ongoing need for new infrastructure and hardware.