
The AI supply crunch extends beyond advanced processors. The industry faces critical shortages of basic components like electrical transformers and switches, with lead times stretching three to five years. This creates a less obvious but significant bottleneck for building the necessary data center infrastructure.

Related Insights

The AI supply chain is crunched not just by obvious components like TSMC wafers and HBM memory. A significant, often overlooked bottleneck is rack manufacturing, including high-speed cables, connectors, and even sheet metal. These components are "sneaky hard" to produce because of extreme power, heat, and signal-integrity demands.

The primary bottleneck for scaling AI over the next decade may be the difficulty of bringing gigawatt-scale power online to support data centers. Smart money is already focused on this challenge, which is more complex than silicon supply.

Building AI data centers or nuclear plants is pointless without the massive transformers needed to connect them to the grid. With lead times of 4-5 years for these components, which rely on Chinese rare earths, this hardware bottleneck is the critical constraint on energy and AI infrastructure expansion.

While the world focused on GPU shortages, the real constraint on AI compute is now physical infrastructure. The bottleneck has moved to accessing power, building data centers, finding specialized labor like electricians, and acquiring basic materials like structural steel. Merely acquiring chips is no longer enough to scale.

While GPUs dominated headlines, the most significant bottleneck in scaling AI data centers was 100-year-old power transformer technology. With lead times stretching over three years and costs surging 150%, connecting new data centers to the grid became the primary constraint on the AI buildout.

The primary constraint on powering new AI data centers over the next 2-3 years isn't the energy source itself (like natural gas), but a physical hardware bottleneck. There is a multi-year manufacturing backlog for the specialized gas turbines required to generate power on-site, with only a few global suppliers.

The race to build AI infrastructure was constrained not by advanced semiconductors, but by the availability of power transformers. This overlooked, 100-year-old technology saw lead times balloon to over three years, becoming the single biggest gating factor for new data center deployments.

While NVIDIA may solve the chip shortage, the true limiting factors for AI's growth are physical-world constraints. The US currently lacks sufficient electricity, rare earth minerals, manufacturing capacity, and even power transformers to support the massive, energy-intensive demands of AI.

Even if NVIDIA and TSMC solve wafer shortages, the AI industry faces a looming energy (watts) bottleneck. The inability to power new data centers could cap AI growth, shifting the primary constraint from semiconductor manufacturing to energy infrastructure and supply.

Public announcements for massive new data centers may be "Pollyannaish." The reality is constrained by long lead times for critical hardware: power generators (24 months) and transformers. This supply-chain friction could significantly delay or derail ambitious AI infrastructure projects, regardless of stated demand.