Facing slow utility responses, AI companies are building their own power plants. Because they overbuild for reliability, this excess capacity will eventually be sold back to the grid, turning these facilities into sources of cheap, local energy for nearby communities within five years.

Related Insights

While currently straining power grids, AI data centers have the potential to become key stabilizing partners. By coordinating their massive power draw—for example, giving notice before ending a training run—they can help manage grid load and uncertainty, ultimately reducing overall system costs and improving stability in a decentralized energy network.
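
As a rough sketch of what such coordination might look like in practice, the hypothetical Python snippet below has a data center announce a large load change ahead of time and replace an abrupt drop with a scheduled ramp. The class, function, and numbers are all illustrative assumptions, not an actual grid-operator API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch: a data center announces a large load change to the
# grid operator in advance, then ramps down gradually instead of dropping
# hundreds of megawatts off the grid in a single step.

@dataclass
class LoadChangeNotice:
    site: str
    current_load_mw: float
    target_load_mw: float
    start: datetime     # when the ramp begins
    ramp_minutes: int   # how long the ramp takes

def ramp_schedule(notice: LoadChangeNotice, steps: int = 6):
    """Break one abrupt load change into several smaller, scheduled steps."""
    delta = (notice.target_load_mw - notice.current_load_mw) / steps
    step_gap = timedelta(minutes=notice.ramp_minutes / steps)
    return [
        (notice.start + i * step_gap, notice.current_load_mw + i * delta)
        for i in range(1, steps + 1)
    ]

# A training run finishing at a 300 MW campus, announced 30 minutes ahead:
notice = LoadChangeNotice(
    site="campus-1",
    current_load_mw=300.0,
    target_load_mw=40.0,
    start=datetime(2025, 1, 15, 14, 0),
    ramp_minutes=30,
)
for when, mw in ramp_schedule(notice):
    print(f"{when:%H:%M}  hold load at {mw:.0f} MW")
```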

To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, 'behind-the-meter' power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.

The race to build power infrastructure for AI may lead to an oversupply if adoption follows a sigmoid curve. This excess capacity, much like the post-dot-com broadband glut, could become a positive externality that significantly lowers future energy prices for all consumers.
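
The mechanism is worth spelling out: builders extrapolate early exponential growth, while sigmoid adoption flattens toward a ceiling, and the widening gap between the two curves is the glut. A minimal sketch of that shape comparison, with entirely made-up numbers:

```python
import math

# Illustrative only: compare an exponential capacity forecast against
# actual demand that follows a logistic (sigmoid) curve. All parameters
# are invented to show the shape of the gap, not to model real demand.

def logistic_demand(year: float, ceiling_gw: float = 100.0,
                    growth: float = 0.9, midpoint: float = 5.0) -> float:
    """Actual demand: an S-curve that saturates at `ceiling_gw`."""
    return ceiling_gw / (1.0 + math.exp(-growth * (year - midpoint)))

def extrapolated_demand(year: float, base_gw: float = 1.1,
                        annual_growth: float = 1.4) -> float:
    """Planner's forecast: the early ~2.4x annual growth continues forever."""
    return base_gw * (1.0 + annual_growth) ** year

for year in range(0, 9, 2):
    actual = logistic_demand(year)
    planned = extrapolated_demand(year)
    print(f"year {year}: planned {planned:7.1f} GW, "
          f"actual {actual:5.1f} GW, surplus {planned - actual:+8.1f} GW")
```

The two curves agree almost exactly in the early years, which is what makes the overbuild hard to avoid: by the time actual demand visibly bends, the surplus capacity is already financed and under construction.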

Contrary to the common focus on chip manufacturing, the immediate bottleneck for building new AI data centers is energy. Factors like power availability, grid interconnects, and high-voltage equipment are the true constraints, forcing companies to explore solutions like on-site power generation.

The massive energy demand of AI should be viewed as an opportunity rather than merely a problem. Politician Alex Boris argues that governments should require the private investors building data centers to also fund the necessary upgrades to the aging electrical grid, instead of passing those costs on to public ratepayers.

For AI hyperscalers, the primary energy bottleneck isn't price but speed. Multi-year delays from traditional utilities for new power connections create an opportunity cost of approximately $60 million per day for the US AI industry, justifying massive private investment in captive power plants.
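
The insight quotes the $60 million per day figure without its derivation. One hedged back-of-envelope way to reconstruct a number of that magnitude is to price the capital sitting idle while projects wait for interconnection; every parameter below is an assumption for illustration, not a figure from the source.

```python
# Hedged back-of-envelope: reconstructing a "$60M/day" opportunity cost
# from the carrying cost of idle capital. All three inputs are assumed.

CAPEX_PER_GW_USD = 40e9       # assumed all-in capital per GW of AI capacity
COST_OF_CAPITAL = 0.10        # assumed annual cost of that capital sitting idle
STALLED_CAPACITY_GW = 5.5     # assumed capacity waiting on grid interconnection

idle_cost_per_gw_day = CAPEX_PER_GW_USD * COST_OF_CAPITAL / 365
total_per_day = idle_cost_per_gw_day * STALLED_CAPACITY_GW

print(f"idle cost per GW-day: ${idle_cost_per_gw_day / 1e6:.1f}M")
print(f"total opportunity cost: ${total_per_day / 1e6:.0f}M/day")
```

Under these assumptions, roughly $11M per GW-day across about 5.5 GW of stalled projects lands near $60M/day, which shows why even an expensive captive power plant can pencil out if it shaves years off the wait.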

To secure the immense, stable power required for AI, tech companies are pursuing plans to co-locate hyperscale data centers with dedicated Small Modular Reactors (SMRs). These "nuclear computation hubs" create a private, reliable baseload power source, making the data center independent of the increasingly strained public electrical grid.

The public power grid cannot expand fast enough to support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely fueled by natural gas, in which data centers produce their own power and only "sip" from the grid during off-peak times.
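
A toy dispatch rule makes the "sip" pattern concrete: default to on-site generation, and draw a small, bounded amount from the grid only during off-peak hours. The capacity, off-peak window, and sip limit below are all illustrative assumptions:

```python
# Hypothetical dispatch rule for a behind-the-meter site: serve the load
# from on-site generation by default, and "sip" a bounded amount from the
# grid only during assumed off-peak hours.

ONSITE_CAPACITY_MW = 350.0
OFFPEAK_HOURS = set(range(0, 6)) | {22, 23}  # assumed off-peak window
GRID_SIP_LIMIT_MW = 50.0                     # assumed cap on grid draw

def dispatch(load_mw: float, hour: int) -> dict:
    """Split a site's load between on-site generation and the grid."""
    grid_mw = min(GRID_SIP_LIMIT_MW, load_mw) if hour in OFFPEAK_HOURS else 0.0
    onsite_mw = min(ONSITE_CAPACITY_MW, load_mw - grid_mw)
    return {"onsite_mw": onsite_mw, "grid_mw": grid_mw}

print(dispatch(load_mw=320.0, hour=3))   # off-peak: sips 50 MW from the grid
print(dispatch(load_mw=320.0, hour=15))  # peak: fully self-supplied
```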

To circumvent grid connection delays, infrastructure costs, and potential consumer rate impacts, data centers are increasingly opting for energy independence. They are deploying on-site power solutions like gas turbines and fuel cells, which can be faster to implement and avoid burdening the local utility system.

The primary factor for siting new AI hubs has shifted from network routes and cheap land to the availability of stable, large-scale electricity. This creates "strategic electricity advantages" where regions with reliable grids and generation capacity are becoming the new epicenters for AI infrastructure, regardless of their prior tech hub status.