The rapid build-out of data centers to power AI is consuming so much energy that it is driving a broad, national increase in electricity costs. This trend is now a noticeable contributor to CPI inflation and is expected to persist.

Related Insights

The massive electricity demand from AI data centers is creating an urgent need for reliable power. This has caused a surge in demand for natural gas turbines—a market considered dead just years ago—as renewables alone cannot meet the new load.

The International Energy Agency projects global data center electricity use will reach 945 TWh by 2030. This staggering figure is almost twice the current annual electricity consumption of an industrialized nation like Germany, highlighting unprecedented energy demand from a single tech sector and making energy the primary bottleneck for AI growth.
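A quick sanity check on the "almost twice Germany" comparison. The ~500 TWh figure for Germany's annual electricity consumption is an assumption for illustration, not a number from the source:

```python
# IEA projection for global data center electricity use by 2030 (from the text)
projected_dc_twh = 945

# Germany's approximate annual electricity consumption in TWh
# (assumed round figure for illustration, not from the source)
germany_twh = 500

ratio = projected_dc_twh / germany_twh
print(f"Projected data-center use is {ratio:.2f}x Germany's annual consumption")
```

At roughly 1.9x, the projection is indeed "almost twice" a major industrial economy's entire electricity consumption.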

For 2026, AI's primary economic effect is fueling demand through massive investment in infrastructure like data centers. The widely expected productivity gains that would lower inflation (the supply-side effect) won't materialize for a few years, creating a short-term inflationary pressure from heightened business spending.

Despite staggering announcements for new AI data centers, a primary limiting factor will be the availability of electrical power. The current growth curve of the power infrastructure cannot support all the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."

Unlike typical diversified economic growth, the current electricity demand surge is overwhelmingly driven by data centers. This concentration creates a significant risk for utilities: if the AI boom falters after massive grid investments are made, that infrastructure could become stranded, posing a huge financial problem.

Before AI delivers long-term deflationary productivity, it requires a massive, inflationary build-out of physical infrastructure. This makes sectors like utilities, pipelines, and energy infrastructure a timely hedge against inflation and a diversifier away from concentrated tech bets.

While semiconductor access is a critical choke point, the long-term constraint on U.S. AI dominance is energy. Building massive data centers requires vast, stable power, but the U.S. faces supply chain issues for energy hardware and lacks a unified grid. China, in contrast, is strategically building out its energy infrastructure to support its AI ambitions.

Most of the world's energy capacity build-out over the next decade was planned using old models, completely omitting the exponential power demands of AI. This creates a looming, unpriced-in bottleneck for AI infrastructure development that will require significant new investment and planning.

The infrastructure demands of AI have driven an exponential increase in data center scale. Two years ago, a 1-megawatt facility was considered a good size. Today, a large AI data center is a 1-gigawatt facility, a 1,000-fold increase. This rapid escalation underscores the immense capital investment required to power AI.

The primary constraint for scaling high-frequency trading operations has shifted from minimizing latency (e.g., shorter wires) to securing electricity. Even for a firm like Hudson River Trading, which is smaller than tech giants, negotiating for power grid access is the main bottleneck for building new GPU data centers.