
Just two years ago, suggesting a data center operate off-grid was unthinkable. Today, because the public grid cannot support the massive power demands of AI, building dedicated, on-site power generation ("behind the meter") has rapidly become the new industry norm.

Related Insights

Power for AI data centers is not limited to the traditional grid or a few turbine suppliers. Operators are turning to a diverse portfolio of "behind-the-meter" power sources, including repurposed jet engines (aeroderivatives), large reciprocating engines from ships and trucks, and fuel cells to rapidly scale capacity.

To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, "behind-the-meter" power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.

The energy demand from AI can be met by allowing data centers to generate their own power "behind the meter." This avoids burdening the public grid and allows data centers to sell excess power back, potentially lowering electricity costs for everyone through economies of scale.

AI companies are building their own power plants because utilities respond too slowly. They overbuild for reliability, and within five years that excess capacity is expected to be sold back to the grid, turning these facilities into desirable sources of cheap, local energy for their communities.

The massive energy consumption of AI data centers is driving the first significant growth in electricity demand in 70 years, a surge comparable to the widespread adoption of air conditioning. This is forcing tech giants to adopt a "Bring Your Own Power" (BYOP) approach, essentially turning them into energy producers.

The massive power demands of AI will force hyperscalers to abandon their reliance on the public grid. They will build dedicated, co-located power plants, likely small modular nuclear reactors. This "Bring Your Own Energy" approach ensures speed to power and creates opportunities to sell excess energy back to communities.

The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.

To circumvent grid connection delays, infrastructure costs, and potential consumer rate impacts, data centers are increasingly opting for energy independence. They are deploying on-site power solutions like gas turbines and fuel cells, which can be faster to implement and avoid burdening the local utility system.

The "across the meter" concept involves co-locating power generation with a data center and a grid interconnection. This allows the data center to consume the power it needs, draw from the grid to cover shortfalls, and, crucially, supply its excess generated power back to the grid. This transforms a major power consumer into a source of energy abundance for the local community.

The urgent need for AI compute capacity is outpacing grid upgrade timelines, which can take 3-5 years. In response, hyperscalers are installing "behind-the-meter" power solutions—often less-efficient, simple-cycle natural gas generators—as a pragmatic way to get data centers operational years faster than waiting for utility connections.