Rather than treating the massive energy demand of AI purely as a problem, it can be framed as an opportunity. Politician Alex Boris argues that governments should require the private capital building data centers to also pay for the necessary upgrades to the aging electrical grid, instead of passing those costs on to public ratepayers.
While currently straining power grids, AI data centers have the potential to become key stabilizing partners. By coordinating their massive power draw—for example, giving notice before ending a training run—they can help manage grid load and uncertainty, ultimately reducing overall system costs and improving stability in a decentralized energy network.
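To make the coordination idea concrete, the sketch below shows a data center giving advance notice and then ramping its draw down in steps rather than dropping hundreds of megawatts off the grid at once. Everything here is a hypothetical illustration: the `notify_grid_operator` hook, the `RampPolicy` parameters, and the MW figures are invented, standing in for whatever demand-response interface a utility or ISO actually exposes.

```python
# Minimal sketch (hypothetical names and figures throughout): before a large
# load change such as a training run ending, the data center notifies the
# grid operator and winds down in steps instead of dropping instantly.
import time
from dataclasses import dataclass

@dataclass
class RampPolicy:
    notice_seconds: int = 300   # assumed advance-notice window
    step_mw: float = 50.0       # assumed max load change per step
    step_seconds: int = 60      # pause between steps so the grid can rebalance

def notify_grid_operator(event: str, detail: dict) -> None:
    # Placeholder: in practice this would be a demand-response signal
    # to the local utility or ISO, not a print statement.
    print(f"[grid notice] {event}: {detail}")

def ramp_down(current_mw: float, target_mw: float, policy: RampPolicy) -> None:
    """Wind down data-center draw gradually, with advance notice."""
    notify_grid_operator("load_change_scheduled", {
        "from_mw": current_mw,
        "to_mw": target_mw,
        "starts_in_s": policy.notice_seconds,
    })
    time.sleep(policy.notice_seconds)  # honor the notice window
    load = current_mw
    while load > target_mw:
        load = max(target_mw, load - policy.step_mw)
        notify_grid_operator("load_step", {"now_mw": load})
        time.sleep(policy.step_seconds)

# Example: a 300 MW training cluster finishing a run, settling to 40 MW idle
# (demo-scale timings so the sketch runs in seconds).
if __name__ == "__main__":
    ramp_down(300.0, 40.0, RampPolicy(notice_seconds=3, step_mw=80.0, step_seconds=1))
```

The point of the stepped ramp is that a predictable, pre-announced load change is something a grid operator can plan reserves around, whereas an unannounced gigawatt-scale swing is precisely the kind of uncertainty that drives up system costs.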
To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, 'behind-the-meter' power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.
Despite the staggering scale of announced new AI data centers, the primary limiting factor will be the availability of electrical power. The power infrastructure's current growth curve cannot support all of the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."
Unlike typical diversified economic growth, the current electricity demand surge is overwhelmingly driven by data centers. This concentration creates a significant risk for utilities: if the AI boom falters after massive grid investments have been made, that new infrastructure could become a stranded asset, leaving utilities and their ratepayers to absorb the cost.
Facing immense electricity needs for AI, tech giants like Amazon are now directly investing in nuclear power, particularly small modular reactors (SMRs). This infusion of venture capital is revitalizing a sector that has historically relied on slow-moving government funding, imbuing it with a Silicon Valley spirit.
While semiconductor access is a critical choke point, the long-term constraint on U.S. AI dominance is energy. Building massive data centers requires vast, stable power, but the U.S. faces supply chain issues for energy hardware and lacks a unified grid. China, in contrast, is strategically building out its energy infrastructure to support its AI ambitions.
To secure the immense, stable power required for AI, tech companies are pursuing plans to co-locate hyperscale data centers with dedicated Small Modular Reactors (SMRs). These "nuclear computation hubs" create a private, reliable baseload power source, making the data center independent of the increasingly strained public electrical grid.
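For a sense of scale, a back-of-the-envelope sizing calculation is sketched below. The module rating, PUE, and reserve margin are assumptions chosen for illustration (announced SMR designs are commonly cited in roughly the 77 to 300 MW-per-module range), not figures from any specific deployment.

```python
# Back-of-the-envelope sizing for a co-located "nuclear computation hub".
# All MW figures, PUE, and margins are illustrative assumptions.
import math

def smr_modules_needed(datacenter_mw: float,
                       module_mw: float = 300.0,
                       pue: float = 1.2,
                       reserve_margin: float = 0.15) -> int:
    """Modules required to carry the site, with cooling overhead and reserve."""
    total_draw = datacenter_mw * pue                    # IT load plus cooling/overhead
    firm_capacity = total_draw * (1 + reserve_margin)   # headroom for module outages
    return math.ceil(firm_capacity / module_mw)

# Example: a 1 GW (IT load) hyperscale campus on assumed 300 MW modules.
print(smr_modules_needed(1000.0))  # -> 5 modules under these assumptions
```

Even under these rough assumptions, the arithmetic shows why co-location is attractive: a handful of modules can firm up an entire gigawatt-class campus without touching the public grid's interconnection queue.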
The rapid build-out of data centers to power AI is consuming so much electricity that it is driving a broad, national rise in electricity prices. That increase is now a measurable contributor to CPI inflation and is expected to persist.
Most of the world's energy capacity build-out over the next decade was planned with forecasting models that predate the AI boom and therefore omit its exponential power demands entirely. This creates a looming, unpriced-in bottleneck for AI infrastructure development, one that will require significant new investment and planning to close.
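A toy calculation illustrates why the omission matters: supply planned on a linear, pre-AI trajectory is overtaken quickly by exponentially growing demand. All starting values and growth rates below are invented purely for illustration.

```python
# Toy model of the planning gap: linear supply additions vs. exponential
# data-center demand growth. Every number here is a made-up assumption.

def first_shortfall_year(supply_gw: float = 50.0,     # assumed planned capacity today
                         supply_add_gw: float = 3.0,  # assumed linear annual additions
                         demand_gw: float = 30.0,     # assumed AI-related demand today
                         demand_growth: float = 0.25  # assumed 25%/yr growth rate
                         ) -> int:
    """Years until exponential demand overtakes linearly planned supply."""
    year = 0
    while demand_gw <= supply_gw:
        year += 1
        supply_gw += supply_add_gw
        demand_gw *= 1 + demand_growth
    return year

print(first_shortfall_year())  # -> 4 under these toy numbers: the gap opens fast
```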
The primary factor for siting new AI hubs has shifted from network routes and cheap land to the availability of stable, large-scale electricity. This creates "strategic electricity advantages" where regions with reliable grids and generation capacity are becoming the new epicenters for AI infrastructure, regardless of their prior tech hub status.
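One way to picture the shift is as a re-weighting of a site-selection model, sketched below with entirely invented weights, criteria names, and candidate scores: once firm power dominates the scoring, a power-rich region beats a legacy tech hub despite weaker fiber connectivity and no prior tech pedigree.

```python
# Hedged sketch of re-weighted siting criteria. The weights and the two
# candidate profiles are invented for illustration only.

WEIGHTS = {
    "firm_power": 0.5,        # stable, large-scale electricity: the new #1 factor
    "grid_reliability": 0.2,
    "fiber_routes": 0.15,     # formerly a primary factor
    "land_cost": 0.15,        # formerly a primary factor
}

def site_score(features: dict) -> float:
    """Weighted sum of normalized (0-1) site features."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)

candidates = {
    "legacy_tech_hub":   {"firm_power": 0.3, "grid_reliability": 0.6,
                          "fiber_routes": 0.9, "land_cost": 0.2},
    "power_rich_region": {"firm_power": 0.9, "grid_reliability": 0.8,
                          "fiber_routes": 0.5, "land_cost": 0.8},
}

for name, feats in candidates.items():
    print(name, round(site_score(feats), 2))
# -> legacy_tech_hub 0.44, power_rich_region 0.81: under these weights,
#    the "strategic electricity advantage" decides the siting outcome.
```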