OpenAI leveraged its massive demand for compute to secure warrants for a potential 11% stake in chipmaker Cerebras at a fraction of a penny per share. The deal, tied to a $20 billion multi-year purchase commitment, highlights the bargaining power major AI model developers hold over their supply chain.
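A rough back-of-the-envelope on those warrant economics: the 11% stake and $20 billion commitment come from the deal as described above, but the Cerebras valuation below is a made-up assumption purely for illustration.

```python
# Rough warrant economics. The 11% stake and $20B purchase commitment
# are from the reported deal; the valuation is a hypothetical assumption.
warrant_stake = 0.11
purchase_commitment = 20e9   # $20B multi-year purchase commitment
assumed_valuation = 8e9      # hypothetical Cerebras valuation ($), not from the source

# Warrants priced at a fraction of a penny per share are effectively free,
# so their value is roughly the stake times the company's valuation.
warrant_value = warrant_stake * assumed_valuation

# Viewed as a rebate on the purchase commitment:
effective_rebate = warrant_value / purchase_commitment

print(f"Warrant value:    ${warrant_value / 1e9:.2f}B")  # $0.88B
print(f"Effective rebate: {effective_rebate:.1%}")       # 4.4%
```

Under these assumed numbers, the near-free warrants work out to a single-digit-percent rebate on the chip spend, which is why the stake can be large without the discount looking large.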

Related Insights

Facing Nvidia's near-total capture of AI data center revenue growth since 2022, AMD CEO Lisa Su made a "bet the farm" move. By granting OpenAI warrants for up to 10% of AMD, she aims to secure a critical design win for its next-generation chip, validating it as a viable competitor to Nvidia.

OpenAI's investment in custom silicon is not just about performance; it's a strategic move to reduce dependency on hardware suppliers like Nvidia, AMD, and AWS. Owning its own hardware stack provides crucial negotiating leverage, potentially lowering long-term costs even if the chip itself faces near-term hurdles.

OpenAI isn't just buying chips from Cerebras; it's financing data centers and taking warrants. This strategy de-risks the supplier and secures long-term compute access, creating a new partnership model for capital-intensive AI development that goes beyond simple procurement.

To secure a foundational customer like OpenAI, capital-intensive infrastructure startups like Cerebras may have to offer extremely generous terms, including massive, near-free equity stakes. This "deal they had to take" dynamic is necessary to overcome the cold start problem and achieve scale, demonstrating the immense leverage held by large AI model companies.

Despite losing money, OpenAI leveraged its massive user base to secure warrants for 10% of AMD. This contrasts with NVIDIA, which received equity in OpenAI, showing how control of users dictates power in strategic partnerships, even with hardware giants.

OpenAI's compute deal with Cerebras, alongside deals with AMD and Nvidia, shows that hyperscalers are aggressively diversifying their AI chip supply. This creates a massive opportunity for smaller, specialized silicon teams, heralding a new competitive era reminiscent of the PC wars.

NVIDIA funds OpenAI's compute purchases (of NVIDIA chips) with an equity investment. This effectively gives OpenAI a discount without lowering market prices, while NVIDIA gains equity in a key customer and locks in massive sales.
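One way to see that mechanic is a toy model with made-up figures; neither dollar amount below is from the source, only the structure of the flows.

```python
# Toy model of the round-trip financing described above.
# Both dollar figures are hypothetical, for illustration only.
chip_purchases = 10e9      # OpenAI's committed spend on NVIDIA chips ($)
equity_investment = 2e9    # NVIDIA's equity investment in OpenAI ($)

# NVIDIA books the purchases at full list price, so the market
# price of its chips is undisturbed...
nvidia_revenue = chip_purchases

# ...while OpenAI's net cash outlay is reduced by the equity inflow,
# which functions as an off-the-sticker discount.
openai_net_outlay = chip_purchases - equity_investment
implied_discount = equity_investment / chip_purchases

print(f"NVIDIA revenue at list price: ${nvidia_revenue / 1e9:.0f}B")
print(f"OpenAI net outlay:            ${openai_net_outlay / 1e9:.0f}B")
print(f"Implied discount:             {implied_discount:.0%}")  # 20%
```

In this sketch the discount never touches the chip price: it arrives as equity, so NVIDIA's reported pricing power is preserved while OpenAI's effective cost falls.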

While training has dominated the conversation, user experience and revenue happen at inference. OpenAI's massive deal with chip startup Cerebras targets faster inference, showing that response time is a critical competitive vector that determines whether AI becomes utility infrastructure or remains a novelty.

OpenAI's deal structures highlight the market's perception of each chip provider. NVIDIA's chips are in such demand that OpenAI commits to them at full price (a premium). AMD, in contrast, had to offer OpenAI equity warrants to win its business (a discount), reflecting their relative negotiating power.

OpenAI's deals with suppliers like Cerebras and CoreWeave involve taking significant equity stakes in exchange for large purchase commitments. This strategy effectively turns OpenAI into a powerful venture capital entity, securing its supply chain while also building a valuable investment portfolio at an incredibly low cost basis.