Credit investors should look beyond direct AI companies. According to Victoria Fernandez, the massive infrastructure build-out for AI creates a significant tailwind for power and energy companies, offering a less crowded investment thesis with potentially wider spreads and strong fundamentals.
Current M&A activity related to AI isn't targeting AI model creators. Instead, capital is flowing into consolidating the 'picks and shovels' of the AI ecosystem. This includes derivative plays like data centers, semiconductors, software, and even power suppliers, which are seen as more tangible long-term assets.
The massive energy consumption of AI has made tech giants the most powerful force advocating for new power sources. Their commercial pressure is finally overcoming decades of regulatory inertia around nuclear energy, driving rapid development and deployment of new reactor technologies to meet their insatiable demand.
For years, the tech industry criticized Bitcoin's energy use. Now, the massive energy needs of AI training have forced Silicon Valley to prioritize energy abundance over purely "green" initiatives. Companies like Meta are building huge natural gas-powered data centers, a major ideological shift.
Before AI delivers long-term deflationary productivity, it requires a massive, inflationary build-out of physical infrastructure. This makes sectors like utilities, pipelines, and energy infrastructure a timely hedge against inflation and a diversifier away from concentrated tech bets.
Beyond the well-known semiconductor race, the AI competition is shifting to energy. China's massive, cheaper electricity production is a significant, often overlooked strategic advantage. This redefines the AI landscape, suggesting that superiority in atoms (energy) may become as crucial as superiority in bytes (algorithms and chips).
The AI investment case might be inverted. While tech firms spend trillions on infrastructure with uncertain returns, companies in traditional sectors (industrials, healthcare) can leverage powerful AI services for a fraction of the cost. They capture a massive 'value gap,' gaining productivity without the huge capital outlay.
While semiconductor access is a critical choke point, the long-term constraint on U.S. AI dominance is energy. Building massive data centers requires vast, stable power, but the U.S. faces supply chain issues for energy hardware and lacks a unified grid. China, in contrast, is strategically building out its energy infrastructure to support its AI ambitions.
Satya Nadella clarifies that the primary constraint on scaling AI compute is not the availability of GPUs, but the lack of power and physical data center infrastructure ("warm shells") to install them. This highlights a critical, often overlooked dependency in the AI race: energy and real estate development speed.
Most of the world's energy capacity build-out over the next decade was planned using demand forecasts drawn up before AI, completely omitting its exponential power requirements. This creates a looming, unpriced-in bottleneck for AI infrastructure development that will require significant new investment and planning.
The primary factor for siting new AI hubs has shifted from network routes and cheap land to the availability of stable, large-scale electricity. This creates "strategic electricity advantages" where regions with reliable grids and generation capacity are becoming the new epicenters for AI infrastructure, regardless of their prior tech hub status.