Utilities have firm commitments for 110 gigawatts of data center power capacity, while demand forecasts predict a need for only an additional 50 gigawatts by 2030. Simple arithmetic on that discrepancy points to a potential overbuild and future oversupply in the market.
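
As a rough illustration of that gap, using only the two headline figures above (no assumptions about utilization, phasing, or project attrition):

    # Back-of-envelope comparison of committed capacity vs. forecast demand.
    committed_gw = 110   # firm utility commitments for data center capacity
    forecast_gw = 50     # additional demand forecast through 2030

    surplus_gw = committed_gw - forecast_gw        # 60 GW of potential excess
    overbuild_ratio = committed_gw / forecast_gw   # 2.2x the forecast need

    print(f"Potential surplus: {surplus_gw} GW")
    print(f"Commitments vs. forecast: {overbuild_ratio:.1f}x")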

Related Insights

The primary bottleneck for scaling AI over the next decade may be the difficulty of bringing gigawatt-scale power online to support data centers. Smart money is already focused on this challenge, which is more complex than silicon supply.

AI companies are building their own power plants because utilities respond too slowly. They overbuild for reliability, and within five years that excess capacity will likely be sold back to the grid, turning these plants into desirable sources of cheap, local energy for their communities.

The race to build power infrastructure for AI may lead to an oversupply if adoption follows a sigmoid curve. This excess capacity, much like the post-dot-com broadband glut, could become a positive externality that significantly lowers future energy prices for all consumers.
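
A minimal sketch of that dynamic, using a hypothetical logistic (sigmoid) adoption curve and a straight-line capacity build-out extrapolated from the steepest part of the ramp; all parameters are illustrative, not from the source:

    import math

    def demand_gw(year, ceiling=60, midpoint=5, steepness=1.0):
        # Illustrative logistic adoption curve saturating at `ceiling` GW.
        return ceiling / (1 + math.exp(-steepness * (year - midpoint)))

    # Planners observe the steepest part of the ramp (around the midpoint)
    # and extrapolate that growth rate forward as a straight line.
    peak_growth = demand_gw(5) - demand_gw(4)   # roughly 14 GW added per year

    def planned_capacity_gw(year):
        return demand_gw(5) + peak_growth * (year - 5)

    for year in range(5, 11):
        d, c = demand_gw(year), planned_capacity_gw(year)
        print(f"year {year:2d}: demand {d:5.1f} GW, capacity {c:5.1f} GW, surplus {c - d:5.1f} GW")

In this toy model the two curves track each other through the steep phase, then the surplus widens rapidly once adoption flattens, which is the mechanism behind the broadband-glut comparison.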

The International Energy Agency projects global data center electricity use will reach 945 TWh by 2030. This staggering figure is almost twice the current annual consumption of an industrialized nation like Germany, highlighting an unprecedented energy demand from a single tech sector and making energy the primary bottleneck for AI growth.
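
A quick sanity check on that comparison, assuming Germany's annual electricity consumption is roughly 500 TWh (an approximation introduced here, not a figure from the source):

    # Rough cross-check of the "almost twice Germany" comparison.
    data_center_twh_2030 = 945   # IEA projection cited above
    germany_twh = 500            # assumed approximate annual consumption of Germany

    print(f"Projected data center use is {data_center_twh_2030 / germany_twh:.1f}x that figure")  # ~1.9x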

Despite staggering announcements for new AI data centers, a primary limiting factor will be the availability of electrical power. The current growth curve of the power infrastructure cannot support all the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."

Unlike typical diversified economic growth, the current electricity demand surge is overwhelmingly driven by data centers. This concentration creates a significant risk for utilities: if the AI boom falters after massive grid investments are made, that infrastructure could become stranded, posing a huge financial problem.

If the forecasted demand for data centers fails to materialize, utilities could be left with expensive, stranded assets. Without explicit protections, the costs of this overbuild could be passed on to residential and commercial ratepayers, creating significant political and financial risk.

AI labs are flooding utility providers with massive, speculative power requests to secure future capacity. This creates a vicious cycle where everyone asks for more than they need out of fear of missing out, causing gridlock and making it appear there's less available power than actually exists.

The primary constraint on the AI boom is not chips or capital, but aging physical infrastructure. In Santa Clara, NVIDIA's hometown, fully constructed data centers are sitting empty for years simply because the local utility cannot supply enough electricity. This highlights how the pace of AI development is ultimately tethered to the physical world's limitations.

Overwhelmed by speculative demand from the AI boom, power companies are now requiring massive upfront payments and long-term commitments. For example, Georgia Power demands a $600 million deposit for a 500-megawatt request, creating a high barrier to entry and filtering out less viable projects.
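
For scale, that deposit works out to roughly $1.2 million per megawatt requested, a simple division of the two figures cited above:

    # Per-megawatt cost of the upfront deposit in the Georgia Power example.
    deposit_usd = 600_000_000   # required deposit
    request_mw = 500            # size of the power request in megawatts

    print(f"Deposit per megawatt: ${deposit_usd / request_mw:,.0f}")  # $1,200,000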