
The AI industry's massive demand for HBM memory is creating a severe shortage and a tripling of prices for consumer DRAM. This will make devices like iPhones hundreds of dollars more expensive and is projected to cut the low- and mid-range smartphone market in half, as manufacturers cannot absorb the costs.

Related Insights

The demand for HBM memory for AI is causing a global shortage because of a ~4:1 manufacturing trade-off: each bit of HBM produced consumes capacity that could have made four bits of standard DRAM. This supply crunch will raise prices for all electronics, from phones to PCs.
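The ~4:1 trade-off can be made concrete with a minimal sketch. The ratio comes from the insight above; the production figures below are hypothetical, chosen only to show the arithmetic.

```python
# Illustrative sketch of the ~4:1 HBM/DRAM manufacturing trade-off:
# each bit of HBM consumes capacity that could have made ~4 bits of DRAM.
# The 10-billion-bit figure is a made-up example, not a real forecast.

def dram_bits_forgone(hbm_bits_produced: float, trade_off: float = 4.0) -> float:
    """Standard-DRAM bits never produced for a given HBM output."""
    return hbm_bits_produced * trade_off

# Shifting capacity to produce 10B bits of HBM forgoes ~40B bits of
# standard DRAM -- a net loss of ~30B bits of total memory output.
print(dram_bits_forgone(10e9))
```

The net-loss framing is the key point: the industry does not just reallocate bits one-for-one, it shrinks total bit output whenever capacity moves to HBM.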

Unlike past cycles driven solely by new demand (e.g., mobile phones), the current AI memory supercycle is different. The new demand driver, HBM, actively constrains the supply of traditional DRAM by competing for the same limited wafer capacity, intensifying and prolonging the shortage.

The memory shortage is already having real-world consequences: consumer electronics firms are raising PC prices (Dell, Lenovo) and cutting smartphone sales forecasts (MediaTek). Companies are also delaying new product launches to avoid passing higher component costs on to consumers.

OpenAI is buying 3-4 times more memory than it needs for short-term operations. While this could be aggressive future-proofing, a less charitable view suggests a strategic move to corner the DRAM supply, artificially inflating costs and killing the nascent on-device AI market before it can compete.

An analyst claims OpenAI is buying 3-4 times more memory than it currently needs. Beyond aggressive planning, this could be a strategic play to corner the global memory supply. This would artificially constrain competitors, particularly those focused on on-device AI, by making a critical component scarce and expensive.

Producing specialized High-Bandwidth Memory (HBM) for AI is wafer-intensive, yielding only a third of the memory bits per wafer compared to standard DRAM. As makers shift capacity to more-profitable HBM, they directly reduce the supply available for consumer electronics, creating a severe shortage.
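The supply effect of that one-third yield can be sketched with a toy model. The one-third ratio is from the insight above; the wafer count, bits-per-wafer figure, and 30% capacity shift are illustrative assumptions only.

```python
# Hypothetical model of total memory-bit supply as wafer capacity shifts to HBM,
# assuming HBM yields one third of the bits per wafer of standard DRAM.
# Wafer counts and bits-per-wafer are placeholder values, not industry data.

def total_bits(wafers: int, hbm_share: float,
               dram_bits_per_wafer: float = 3e12,
               hbm_yield_ratio: float = 1 / 3) -> float:
    """Total bits produced when `hbm_share` of wafers make HBM instead of DRAM."""
    hbm_wafers = wafers * hbm_share
    dram_wafers = wafers - hbm_wafers
    return (dram_wafers * dram_bits_per_wafer
            + hbm_wafers * dram_bits_per_wafer * hbm_yield_ratio)

baseline = total_bits(1000, 0.0)
shifted = total_bits(1000, 0.3)   # move 30% of wafers to HBM
print(f"supply falls to {shifted / baseline:.0%} of baseline")  # 80%
```

Under these assumptions, diverting 30% of wafers to HBM cuts total bit supply by a fifth, which is how a shift in product mix alone can produce a consumer-side shortage.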

In a surprising market inversion, the price surge for commodity DRAM has become so extreme that its profit margins now exceed those of specialized High-Bandwidth Memory (HBM). This creates a strategic dilemma for producers, forcing them to balance short-term profits against long-term AI market position.

Despite record profits driven by AI demand for High-Bandwidth Memory, chip makers are maintaining a "conservative investment approach" and not rapidly expanding capacity. This strategic restraint keeps prices for critical components high, maximizing their profitability and effectively controlling the pace of the entire AI hardware industry.

The intense demand for memory chips for AI is causing a shortage so severe that NVIDIA is delaying a new gaming GPU for the first time in 30 years. This demonstrates a major inflection point where the AI industry's hardware needs are creating significant, tangible ripple effects on adjacent, multi-billion dollar consumer markets.

Today's DRAM shortage stems from the post-COVID downturn. Expecting weak demand, memory producers became conservative with capital expenditures and didn't expand capacity. This left the industry unprepared for the sudden, explosive demand for memory driven by the AI boom.