The semiconductor supply chain has extremely long lead times. Even with unprecedented demand signals for AI hardware, new memory fabrication plants ordered today will not come online until 2027 or 2028. This multi-year lag all but guarantees that supply bottlenecks and high prices for components like DRAM will persist.

Related Insights

The demand for HBM memory for AI is causing a global shortage because of a ~4:1 manufacturing trade-off: each bit of HBM produced consumes capacity that could have made four bits of standard DRAM. This supply crunch will raise prices for all electronics, from phones to PCs.
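The leverage in that ~4:1 trade-off is easy to underestimate, so here is a minimal arithmetic sketch. The function names and capacity figures are illustrative assumptions, not industry data; only the 4:1 ratio comes from the insight above.

```python
# Illustrative sketch of the ~4:1 HBM-to-DRAM manufacturing trade-off.
# All capacity figures below are made-up units, not real industry numbers.

def dram_bits_displaced(hbm_bits: float, tradeoff_ratio: float = 4.0) -> float:
    """Standard-DRAM bits forgone for each bit of HBM produced."""
    return hbm_bits * tradeoff_ratio

def remaining_dram_capacity(total_dram_capacity: float, hbm_bits: float,
                            tradeoff_ratio: float = 4.0) -> float:
    """DRAM bits still producible after diverting some capacity to HBM."""
    return total_dram_capacity - dram_bits_displaced(hbm_bits, tradeoff_ratio)

# A fab line that could make 100 units of DRAM diverts just 10 units' worth to HBM:
print(remaining_dram_capacity(100, 10))  # -> 60.0
```

The point of the sketch: diverting only 10% of capacity to HBM wipes out 40% of standard-DRAM output, which is why a comparatively small AI-driven HBM ramp tightens supply for all memory-hungry electronics.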

The primary bottleneck for increasing DRAM supply is a "clean room constraint"—a physical shortage of space in existing fabs to install new manufacturing equipment. This limitation means that even with massive investment, significant new wafer capacity is unlikely to come online meaningfully before 2028.

The AI industry's growth constraint is a swinging pendulum. Power and data center space are the current bottlenecks (2024-25), but the energy supply chain is diverse enough to adapt. By 2027, the bottleneck will swing back to semiconductor manufacturing, as leading-edge fab capacity (e.g., TSMC logic, HBM memory) is highly concentrated and takes years to expand.

While energy supply is a concern, the primary constraint for the AI buildout may be semiconductor fabrication. TSMC, the leading manufacturer, is hesitant to build new fabs to meet the massive demand from hyperscalers, creating a significant bottleneck that could slow down the entire industry.

For the next few years, the primary constraint on memory production is not a shortage of manufacturing equipment but a physical lack of clean room space. Memory makers, burned by years of low margins, underinvested in new fabs, which carry a two-year construction lead time.

Despite soaring AI demand, chip fab TSMC is conservatively expanding capacity. This is a rational move to avoid the catastrophic downside of overcapacity, where fixed costs sink profitability for years. However, this decision is creating a massive, predictable chip shortage for the AI industry.

The 2024-2026 AI bottleneck is power and data centers, but the energy industry is adapting with diverse solutions. By 2027, the constraint will revert to semiconductor manufacturing, as leading-edge fab capacity is highly concentrated and takes years to expand.

Despite record capital spending, TSMC's new facilities won't alleviate current AI chip supply constraints. This massive investment is for future demand (2027-2028 and beyond), forcing the company to optimize existing factories for short-term needs, highlighting the industry's long lead times.

Today's DRAM shortage stems from the post-COVID downturn. Expecting weak demand, memory producers became conservative with capital expenditures and didn't expand capacity. This left the industry unprepared for the sudden, explosive demand for memory driven by the AI boom.

Sundar Pichai identifies the critical, non-obvious constraints slowing AI's physical buildout. Beyond chips, the primary bottlenecks are fundamental wafer starts, the slow pace of regulatory permitting for new data centers, and a significant short-term shortage of high-bandwidth memory.