A broad-based corporate minimum tax is a more effective solution than forcing AI companies to subsidize electricity bills directly. It provides the public capital needed for massive infrastructure projects, such as upgrading the national power grid to handle increased demand from data centers, without complex, targeted regulations.

Related Insights

While currently straining power grids, AI data centers have the potential to become key stabilizing partners. By coordinating their massive power draw—for example, giving notice before ending a training run—they can help manage grid load and uncertainty, ultimately reducing overall system costs and improving stability in a decentralized energy network.
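A minimal sketch of what that coordination could look like, assuming a hypothetical operator that publishes a staged ramp-down schedule to the grid ahead of a planned training-run shutdown; the names and figures are illustrative, not drawn from any real program:

```python
# Illustrative sketch only: a hypothetical data center announcing a staged
# ramp-down ahead of a training-run shutdown, so the grid operator can plan
# reserves instead of absorbing an abrupt multi-hundred-MW drop.
# All names and numbers here are invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class RampStep:
    time: datetime    # when the step takes effect
    load_mw: float    # data center draw after the step

def ramp_down_schedule(current_load_mw: float,
                       floor_load_mw: float,
                       shutdown_time: datetime,
                       notice_hours: float = 4.0,
                       steps: int = 4) -> List[RampStep]:
    """Spread the load reduction over `notice_hours` before `shutdown_time`
    in equal steps, rather than dropping it all at once."""
    start = shutdown_time - timedelta(hours=notice_hours)
    delta = (current_load_mw - floor_load_mw) / steps
    return [
        RampStep(time=start + timedelta(hours=notice_hours * (i + 1) / steps),
                 load_mw=current_load_mw - delta * (i + 1))
        for i in range(steps)
    ]

# Example: announce a ramp from 500 MW down to 50 MW over the 4 hours
# before a planned 18:00 shutdown.
schedule = ramp_down_schedule(500.0, 50.0, datetime(2025, 6, 1, 18, 0))
for step in schedule:
    print(f"{step.time:%H:%M}  ->  {step.load_mw:.0f} MW")
```

The point of the sketch is the shape of the interaction: a predictable, stepped reduction announced in advance is far easier for a grid operator to plan around than an unannounced drop of several hundred megawatts.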

To overcome local opposition, tech giants should use their massive balance sheets to provide tangible economic benefits to host communities. Subsidizing local electricity bills or funding renewable energy projects can turn residents into supporters, clearing the path for essential AI infrastructure development.

Taxing a specific industry like AI is problematic as it invites lobbying and creates definitional ambiguity. A more effective and equitable approach is broad tax reform, such as eliminating the capital gains deduction, to create a fairer system for all income types, regardless of the source industry.

Despite staggering announcements for new AI data centers, a primary limiting factor will be the availability of electrical power. The current growth curve of the power infrastructure cannot support all the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."
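The bottleneck argument is easy to make concrete with back-of-envelope arithmetic. The figures below are placeholders rather than sourced estimates, but they show how announced capacity can be weighed against realistic rates of grid expansion:

```python
# Back-of-envelope sketch of the bottleneck argument. The inputs are
# placeholders, not sourced estimates; swap in real project announcements
# and interconnection-queue data to make the comparison meaningful.
announced_datacenter_load_gw = 80.0   # hypothetical sum of announced projects
annual_grid_additions_gw = 15.0       # hypothetical yearly firm-capacity additions
share_available_to_datacenters = 0.3  # hypothetical: most additions serve other demand

effective_gw_per_year = annual_grid_additions_gw * share_available_to_datacenters
years_to_serve_announcements = announced_datacenter_load_gw / effective_gw_per_year

print(f"Usable new capacity per year: {effective_gw_per_year:.1f} GW")
print(f"Years to power all announced projects: {years_to_serve_announcements:.1f}")
# With these placeholder inputs the announced pipeline takes roughly 18 years
# to power, which is the mismatch behind the predicted project failures.
```

Whatever the real inputs turn out to be, the structure of the calculation is the same: if announced load divided by usable annual additions comes out to a decade or more, some announced projects cannot be powered on their promised timelines.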

Unlike typical diversified economic growth, the current electricity demand surge is overwhelmingly driven by data centers. This concentration creates a significant risk for utilities: if the AI boom falters after massive grid investments are made, that infrastructure could become stranded, posing a huge financial problem.

The public is unlikely to approve government guarantees for private AI data centers amid economic hardship. A more palatable strategy is investing in energy infrastructure. This move benefits all citizens with potentially lower power bills while still providing the necessary resources for the AI industry's growth.

Rather than viewing the massive energy demand of AI as just a problem, policymakers can treat it as an opportunity. Politician Alex Bores argues that governments should require the private capital building data centers to also pay for necessary upgrades to the aging electrical grid, instead of passing those costs on to public ratepayers.

The rapid build-out of data centers to power AI is consuming so much energy that it's creating a broad, national increase in electricity costs. This trend is now a noticeable factor contributing to CPI inflation and is expected to persist.
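The CPI claim can be sanity-checked with a rough calculation: electricity's contribution to headline inflation is approximately its weight in the consumer basket times its price change. Both numbers below are illustrative assumptions rather than official figures:

```python
# Rough arithmetic behind the CPI claim. The basket weight and the price
# increase are illustrative assumptions, not official statistics.
electricity_cpi_weight = 0.025      # assume electricity is ~2.5% of the CPI basket
electricity_price_increase = 0.10   # assume a 10% year-over-year rise in power prices

contribution_pct_points = electricity_cpi_weight * electricity_price_increase * 100
print(f"Approximate contribution to headline CPI: {contribution_pct_points:.2f} pp")
# ~0.25 percentage points under these assumptions: small in absolute terms,
# but noticeable when headline inflation is in the low single digits.
```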

Geopolitical competition with China has forced the U.S. government to treat AI development as a national security priority, similar to the Manhattan Project. This means the massive AI CapEx buildout will be implicitly backstopped to prevent an economic downturn, effectively turning the sector into a regulated utility.

Most of the world's energy capacity build-out over the next decade was planned using old models, completely omitting the exponential power demands of AI. This creates a looming, unpriced-in bottleneck for AI infrastructure development that will require significant new investment and planning.