
The urgent need for AI compute capacity is outpacing grid upgrade timelines, which can take 3-5 years. In response, hyperscalers are installing "behind-the-meter" power solutions—often less-efficient, simple-cycle natural gas generators—as a pragmatic way to get data centers operational years faster than waiting for utility connections.

Related Insights

The massive electricity demand from AI data centers is creating an urgent need for reliable power. This has caused a surge in demand for natural gas turbines—a market considered dead just years ago—as renewables alone cannot meet the new load.

To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, "behind-the-meter" power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.

The energy demand from AI can be met by allowing data centers to generate their own power "behind the meter." This avoids burdening the public grid and allows data centers to sell excess power back, potentially lowering electricity costs for everyone through economies of scale.

AI companies are building their own power plants due to slow utility responses. They overbuild for reliability, and this excess capacity will eventually be sold back to the grid, transforming them into desirable sources of cheap, local energy for communities within five years.

Contrary to the common focus on chip manufacturing, the immediate bottleneck for building new AI data centers is energy. Factors like power availability, grid interconnects, and high-voltage equipment are the true constraints, forcing companies to explore solutions like on-site power generation.

Contrary to the renewables-focused narrative, the massive, stable energy needs of AI data centers are increasing reliance on natural gas. Underinvestment in grid infrastructure makes gas a critical balancing fuel, now expected to meet a fifth of the world's new power demand (excluding China).

For AI hyperscalers, the primary energy bottleneck isn't price but speed. Multi-year delays from traditional utilities for new power connections create an opportunity cost of approximately $60 million per day for the US AI industry, justifying massive private investment in captive power plants.
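The scale of that opportunity cost can be made concrete with simple arithmetic. The sketch below takes the ~$60 million/day figure from the insight above and compares hypothetical build timelines; the specific delay durations (3 years for a utility interconnection vs. 1 year for an on-site build) are illustrative assumptions, not sourced data.

```python
# Illustrative opportunity-cost arithmetic for the ~$60M/day figure cited above.
# The delay durations below are hypothetical assumptions for comparison only.

DAILY_OPPORTUNITY_COST_USD = 60_000_000  # ~$60M/day (figure from the insight above)

def delay_cost(delay_years: float, daily_cost: float = DAILY_OPPORTUNITY_COST_USD) -> float:
    """Total opportunity cost of waiting for power, in USD."""
    return delay_years * 365 * daily_cost

# Assumed 3-year grid interconnection wait vs. assumed 1-year behind-the-meter build:
grid_wait = delay_cost(3.0)      # $65.7B
btm_build = delay_cost(1.0)      # $21.9B
savings = grid_wait - btm_build  # $43.8B

print(f"Grid wait: ${grid_wait / 1e9:.1f}B")
print(f"Behind-the-meter build: ${btm_build / 1e9:.1f}B")
print(f"Value of faster power: ${savings / 1e9:.1f}B")
```

Even under rough assumptions, the gap runs into the tens of billions of dollars, which is why private captive power plants can be justified despite higher per-kWh generation costs.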

The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.

To circumvent grid connection delays, infrastructure costs, and potential consumer rate impacts, data centers are increasingly opting for energy independence. They are deploying on-site power solutions like gas turbines and fuel cells, which can be faster to implement and avoid burdening the local utility system.

While nuclear energy is the ideal long-term solution for AI, its long development timelines are misaligned with the immediate needs of hyperscalers. Natural gas plants, which can be built much faster, will be the essential interim solution, creating a major investment opportunity in the sector.

To Bypass Grid Delays, AI Data Centers Are Deploying Inefficient On-Site Natural Gas Generators | RiffOn