Despite forecasts of massive energy demand growth from AI data centers, forward power curves are flat and natural gas futures are downward sloping. This suggests that sophisticated energy traders do not believe the bullish demand narrative and are not pricing in a future supply crunch.
Contrary to popular belief, recent electricity price hikes are not yet driven by AI demand. Instead, they reflect a system that had already become less reliable due to the retirement of dispatchable coal power and increased dependence on intermittent renewables. The grid was already tight before the current demand wave hit.
The primary bottleneck for scaling AI over the next decade may be the difficulty of bringing gigawatt-scale power online to support data centers. Smart money is already focused on this challenge, which is more complex than silicon supply.
The narrative that energy is a hard cap on AI's growth is largely overstated. AI labs treat energy as a solvable cost problem, not an insurmountable barrier. They willingly pay significant premiums for faster, non-traditional power solutions because the extra cost is negligible compared to the massive expense of GPUs.
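A rough back-of-envelope sketch of that scale argument, using entirely hypothetical figures (the GPU capex, load, and power prices below are illustrative assumptions, not numbers from the source):

```python
# Hypothetical comparison: the cost of paying a premium for fast "bridge" power
# versus the GPU capex it unblocks, for a roughly 1 GW-class training campus.

gpu_capex = 40e9          # assumed GPU + networking spend, USD
it_load_mw = 1000         # assumed average IT load, MW
hours_per_year = 8760

grid_price = 60.0         # assumed $/MWh for ordinary grid power
premium_price = 150.0     # assumed $/MWh for fast, non-traditional power

annual_mwh = it_load_mw * hours_per_year
extra_energy_cost = annual_mwh * (premium_price - grid_price)

print(f"Extra energy cost per year: ${extra_energy_cost / 1e9:.2f}B")
print(f"As a share of GPU capex:    {extra_energy_cost / gpu_capex:.1%}")
# Even at a 2.5x power price, the premium works out to roughly 2% of the
# hardware bill under these assumptions -- which is why labs can treat
# energy as a cost problem rather than a hard cap.
```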
The race to build power infrastructure for AI may lead to an oversupply if adoption follows a sigmoid curve. This excess capacity, much like the post-dot-com broadband glut, could become a positive externality that significantly lowers future energy prices for all consumers.
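A toy model of how that overshoot happens, assuming a logistic adoption curve and a straight-line build-out (all parameters below are invented for illustration):

```python
import math

# Toy model: capacity is built against an extrapolated straight-line forecast,
# while actual adoption follows a sigmoid that flattens -- so surplus capacity
# accumulates in the later years.

def sigmoid_demand(year, peak_gw=100, midpoint=2029, steepness=0.9):
    """AI power demand in GW if adoption saturates (logistic curve)."""
    return peak_gw / (1 + math.exp(-steepness * (year - midpoint)))

def planned_capacity(year, start_gw=10, build_rate_gw=20):
    """Capacity in GW if builders keep extrapolating early growth."""
    return start_gw + build_rate_gw * (year - 2025)

for year in range(2025, 2034):
    d = sigmoid_demand(year)
    c = planned_capacity(year)
    print(f"{year}: demand {d:5.1f} GW, capacity {c:5.1f} GW, surplus {c - d:5.1f} GW")
```

Under these assumed shapes the surplus widens every year after adoption flattens, which is the broadband-glut analogy in the take above.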
Despite staggering announcements of new AI data centers, a primary limiting factor will be the availability of electrical power. Power infrastructure's current growth trajectory cannot support all of the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."
Contrary to the renewables-focused narrative, the massive, stable energy needs of AI data centers are increasing reliance on natural gas. Underinvestment in grid infrastructure makes gas a critical balancing fuel, now expected to meet a fifth of the world's new power demand (excluding China).
Utilities have firm commitments for 110 gigawatts of data center power capacity, while demand forecasts predict a need for only an additional 50 gigawatts by 2030. The simple math of that discrepancy points to a potential overbuild and future oversupply in the market.
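The simple math, spelled out using the figures quoted in the claim:

```python
# Figures as quoted in the claim above.
committed_gw = 110   # utilities' firm data-center capacity commitments
forecast_gw = 50     # additional demand forecast through 2030

overbuild_gw = committed_gw - forecast_gw
print(f"Potential overbuild: {overbuild_gw} GW")                      # 60 GW
print(f"Commitments vs forecast: {committed_gw / forecast_gw:.1f}x")  # 2.2x
```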
The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.
Most of the world's energy capacity build-out over the next decade was planned using old models that completely omit the exponential power demands of AI. This creates a looming, unpriced-in bottleneck for AI infrastructure development that will require significant new investment and planning.
AI labs are flooding utility providers with massive, speculative power requests to secure future capacity. This creates a vicious cycle where everyone asks for more than they need out of fear of missing out, causing gridlock and making it appear there's less available power than actually exists.
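A small sketch of how that over-requesting inflates apparent demand (the project sizes, number of utilities shopped, and padding factor are all assumptions chosen for illustration):

```python
# Toy illustration: if each developer shops the same project to several
# utilities "just in case" and pads each request, the summed queue of
# requests can dwarf the real demand behind it.

real_projects_gw = 30               # assumed genuine demand across all developers
utilities_shopped_per_project = 4   # assumed: each project filed with 4 utilities
padding_factor = 1.5                # assumed: requests padded 50% above real need

apparent_queue_gw = real_projects_gw * utilities_shopped_per_project * padding_factor
print(f"Real demand:       {real_projects_gw} GW")
print(f"Apparent requests: {apparent_queue_gw:.0f} GW")  # 180 GW of "demand" on paper
```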