
The demand for electricity from AI is growing faster than the grid's bureaucratic capacity to expand. Doomberg predicts most new data centers will need to generate their own power, likely from natural gas, to bypass connection bottlenecks and avoid causing retail electricity price spikes for consumers.

Related Insights

To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, "behind-the-meter" power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.

The energy demand from AI can be met by allowing data centers to generate their own power "behind the meter." This avoids burdening the public grid and allows data centers to sell excess power back, potentially lowering electricity costs for everyone through economies of scale.

AI companies are building their own power plants because utilities respond too slowly. They overbuild for reliability, and within five years that excess capacity will be sold back to the grid, turning these data centers into desirable sources of cheap, local energy for nearby communities.

The massive energy consumption of AI data centers is causing electricity demand to spike for the first time in 70 years, a surge comparable to the widespread adoption of air conditioning. This is forcing tech giants to adopt a "Bring Your Own Power" (BYOP) policy, essentially turning them into energy producers.

Despite staggering announcements for new AI data centers, a primary limiting factor will be the availability of electrical power. The current growth curve of the power infrastructure cannot support all the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."

Contrary to the common focus on chip manufacturing, the immediate bottleneck for building new AI data centers is energy. Factors like power availability, grid interconnects, and high-voltage equipment are the true constraints, forcing companies to explore solutions like on-site power generation.

The massive power demands of AI will force hyperscalers to abandon their reliance on the public grid. They will build dedicated, co-located power plants, likely small modular nuclear reactors. This "Bring Your Own Energy" approach ensures speed to power and creates opportunities to sell excess energy back to communities.

Just two years ago, suggesting a data center operate off-grid was unthinkable. Today, because the public grid cannot support the massive power demands of AI, building dedicated, on-site "behind-the-meter" power generation has rapidly become the new industry norm.

The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.

The urgent need for AI compute capacity is outpacing grid upgrade timelines, which can take 3-5 years. In response, hyperscalers are installing "behind-the-meter" power solutions—often less-efficient, simple-cycle natural gas generators—as a pragmatic way to get data centers operational years faster than waiting for utility connections.
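The speed-to-power trade-off above can be sketched as simple arithmetic. This is a minimal illustration, not an industry model: only the 3-5 year grid interconnection window comes from the text, and the 1-2 year on-site gas install window is an assumption for the sake of the example.

```python
# Illustrative comparison of time-to-power options for a data center.
# The grid figure (3-5 years) is cited in the text; the on-site gas
# figure (1-2 years) is an assumed value for this sketch only.

def years_to_power(options: dict[str, tuple[float, float]]) -> dict[str, float]:
    """Return the midpoint of each (low, high) timeline range, in years."""
    return {name: (lo + hi) / 2 for name, (lo, hi) in options.items()}

options = {
    "grid_interconnect": (3.0, 5.0),  # utility upgrade window from the text
    "simple_cycle_gas": (1.0, 2.0),   # assumed on-site install window
}

midpoints = years_to_power(options)
advantage = midpoints["grid_interconnect"] - midpoints["simple_cycle_gas"]
print(f"Estimated speed-to-power advantage: {advantage:.1f} years")
# → Estimated speed-to-power advantage: 2.5 years
```

Under these assumed midpoints, on-site generation brings a facility online roughly two and a half years sooner, which is the "pragmatic" calculus the summary describes: hyperscalers accept lower simple-cycle efficiency in exchange for years of earlier operation.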