While an AI bubble may sound like bad news, the overproduction of compute creates a favorable environment for the companies that consume it. As compute prices fall, their cost of goods sold decreases, leading to higher gross margins and stronger business fundamentals.
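As a rough sketch of that arithmetic (all numbers below are hypothetical and not from the source), assume compute is the dominant cost of goods sold for an application that charges a fixed price per request:

```python
# Hypothetical illustration of how cheaper compute lifts gross margin.
# All figures are invented for this sketch; none come from the source.

def gross_margin(price_per_request: float, compute_cost_per_request: float) -> float:
    """Gross margin = (revenue - cost of goods sold) / revenue."""
    return (price_per_request - compute_cost_per_request) / price_per_request

price = 1.00  # what the app charges per request (assumed)

# Before an oversupply: compute, the dominant COGS item here, costs $0.60/request.
print(f"before glut: {gross_margin(price, 0.60):.0%}")  # 40%

# After overbuilt capacity pushes compute down to $0.25/request.
print(f"after glut:  {gross_margin(price, 0.25):.0%}")  # 75%
```

The same revenue line supports a much healthier margin purely because the key input got cheaper.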

Related Insights

Rabois argues that, unlike foundation-model or infrastructure plays, AI application startups shouldn't need to burn cash on compute. He believes they should be able to pass these costs through to customers and demonstrate healthy unit economics from day one.

The race to build power infrastructure for AI may lead to oversupply if adoption follows a sigmoid curve and eventually plateaus rather than growing without limit. That excess capacity, much like the post-dot-com broadband glut, could become a positive externality that significantly lowers future energy prices for all consumers.

The massive capital expenditure in AI is largely confined to the "superintelligence quest" camp, which bets on godlike AI transforming the economy. Companies focused on applying current AI to create immediate economic value are not necessarily in a bubble.

Contrary to the idea that technology always gets cheaper, building on AI may never be less expensive than it is right now. The current phase is defined by abundant venture capital and intense competition among AI tool providers, which effectively subsidizes costs for developers. As the market consolidates, those costs will rise.

Current AI spending appears bubble-like, but it isn't propping up unprofitable operations: inference is already profitable. The immense cash burn is a deliberate, forward-looking investment in developing future, more powerful models, not a sign of a failing business model. This reframes the financial risk as a bet on future capability rather than a subsidy for a broken business.

The current AI investment boom is focused on massive infrastructure build-outs. A counterintuitive threat to this trade is not that AI fails, but that it becomes more compute-efficient. This would reduce infrastructure demand, deflating the hardware bubble even as AI proves economically valuable.

The common goal of increasing AI model efficiency could have a paradoxical outcome. If AI performance becomes radically cheaper ("too cheap to meter"), it could devalue the massive investments in compute and data center infrastructure, creating a financial crisis for the very companies that enabled the boom.

Unlike in SaaS, where high gross margins are a key health signal, an AI company with very high margins likely isn't seeing significant use of its core AI features. Low margins signal that customers are actively using compute-intensive products, which is a positive early indicator.

Many AI startups prioritize growth over efficiency, leaving gross margins unsustainably low (below 15%) because of high compute costs. This is a ticking time bomb: eventually these companies must undertake a costly, time-consuming re-architecture to optimize for cost and build a viable business.

The AI value chain flows from hardware (NVIDIA) through LLM providers to applications, with the LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer: that competition drives down API costs, preventing model providers from accumulating excessive pricing power and allowing apps to build sustainable businesses.
