While the cost of a new gas plant has soared to $3,000 per kW, the data center it powers costs $40,000 per kW. For tech giants, paying a large premium to secure a dedicated power source is a rounding error, which explains their willingness to pay well above market rates for electricity.
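The "rounding error" claim can be checked with back-of-envelope arithmetic. A minimal sketch, using the $3,000/kW and $40,000/kW figures from the text; the 2x premium multiplier is a hypothetical illustration:

```python
# Capex ratio sketch: even a large premium on power is small
# relative to the data center it serves.
GAS_PLANT_COST_PER_KW = 3_000      # new gas plant capex, $/kW (from text)
DATA_CENTER_COST_PER_KW = 40_000   # AI data center capex, $/kW (from text)

premium_multiplier = 2.0           # hypothetical: pay double for dedicated power
power_capex = GAS_PLANT_COST_PER_KW * premium_multiplier

share = power_capex / (power_capex + DATA_CENTER_COST_PER_KW)
print(f"Power capex share of total build: {share:.1%}")
```

Even at double the going rate, power is roughly 13% of the combined build cost, and at the unmultiplied $3,000/kW it is about 7%.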
Instead of socializing costs, some utilities are charging data centers premium rates. This revenue not only covers new infrastructure costs but, in some cases like Georgia, is used to provide bill credits or reductions to existing residential and commercial customers, effectively subsidizing them.
When power (watts) is the primary constraint for data centers, the total cost of compute becomes secondary; the crucial metric is performance-per-watt. This gives the most efficient chipmakers a massive pricing advantage, as customers will pay a steep premium for hardware that maximizes output from their fixed power budget.
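The logic above can be made concrete: with a fixed site power envelope, the chip count is set by watts per chip, so total throughput is just performance-per-watt times the budget. A sketch with hypothetical chip names and numbers (nothing here reflects real hardware specs):

```python
# Under a fixed power budget, total site throughput is determined
# by performance-per-watt, not raw per-chip performance.
SITE_POWER_BUDGET_KW = 10_000  # hypothetical fixed power envelope

chips = {
    # name: (throughput units per chip, watts per chip) -- hypothetical
    "chip_a": (100, 700),  # slightly faster, much more efficient
    "chip_b": (90, 900),   # cheaper but power-hungry
}

totals = {}
for name, (perf, watts) in chips.items():
    n_chips = SITE_POWER_BUDGET_KW * 1_000 // watts  # chips that fit the budget
    totals[name] = n_chips * perf
    print(f"{name}: {n_chips} chips, perf/W={perf / watts:.3f}, total={totals[name]}")
```

Here chip_a delivers ~43% more site-level throughput than chip_b despite only an 11% raw performance edge, because efficiency lets more chips fit inside the same power budget.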
To overcome energy bottlenecks, political opposition, and grid reliability issues, AI data center developers are building their own dedicated, "behind-the-meter" power plants. This strategy, typically using natural gas, ensures a stable power supply for their massive operations without relying on the public grid.
The narrative that energy is a hard cap on AI's growth is largely overstated. AI labs treat energy as a solvable cost problem, not an insurmountable barrier. They willingly pay significant premiums for faster, non-traditional power solutions because these extra costs are negligible compared to the massive expense of GPUs.
Contrary to the common focus on chip manufacturing, the immediate bottleneck for building new AI data centers is energy. Factors like power availability, grid interconnects, and high-voltage equipment are the true constraints, forcing companies to explore solutions like on-site power generation.
While costs for essentials like copper and electricity are rising, cash-rich hyperscalers (Google, Meta) will continue building. The real pressure will be on smaller, capital-dependent players like CoreWeave, who may struggle to secure financing as investors scrutinize returns, with marginal projects being canceled.
For AI hyperscalers, the primary energy bottleneck isn't price but speed. Multi-year delays from traditional utilities for new power connections create an opportunity cost of approximately $60 million per day for the US AI industry, justifying massive private investment in captive power plants.
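The shape of that opportunity-cost arithmetic is simple: capital that sits idle waiting for a grid connection still accrues depreciation and financing cost every day. A hedged sketch; the $60 million/day figure comes from the text, while the capex and annual-cost-rate inputs below are hypothetical round numbers chosen to show the calculation's structure, not a derivation of that figure:

```python
# Daily cost of stranded capital waiting on a power connection.
stranded_capex = 100e9    # hypothetical: $100B of hardware/sites sitting idle
annual_cost_rate = 0.22   # hypothetical: depreciation + cost of capital per year

daily_cost = stranded_capex * annual_cost_rate / 365
print(f"Daily opportunity cost: ${daily_cost / 1e6:.0f}M")
```

Against a daily bleed of this magnitude, spending a few hundred million dollars on a captive gas plant that shaves years off the connection timeline pays for itself quickly.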
The public power grid cannot support the massive energy needs of AI data centers. This will force a shift toward on-site, "behind-the-meter" power generation, likely using natural gas, where data centers generate their own power and only "sip" from the grid during off-peak times.
To circumvent grid connection delays, infrastructure costs, and potential consumer rate impacts, data centers are increasingly opting for energy independence. They are deploying on-site power solutions like gas turbines and fuel cells, which can be faster to implement and avoid burdening the local utility system.
To overcome local opposition, hyperscalers are creating novel utility contracts that have zero financial impact on local ratepayers. They agree to guarantee a return on the utility's specific capital expenditures, ensuring data center costs are not passed on to other customers.