Despite massive investment in chips (NVIDIA) and models (OpenAI), it is not yet clear where long-term value will concentrate. The entire stack is in flux: open-source alternatives could commoditize the model layer, chips could follow historical commoditization cycles, and new AI-native apps could capture the most value. We are only in the early innings of a 30-year shift.

Related Insights

While investors now believe in AI's transformative power, it remains unclear who will profit most. Value could accrue to chip makers (NVIDIA), foundation models (OpenAI), or the application layer. This fundamental uncertainty is a primary driver of the significant volatility across the tech sector.

While immense value is being *created* for end-users via applications like ChatGPT, that value is primarily *accruing* to companies with deep moats in the infrastructure layer—namely hardware providers like NVIDIA and hyperscalers. The long-term defensibility of model-makers remains an open question.

Historical tech cycles like the cloud and mobile demonstrate a consistent pattern: the application layer ultimately generates 5 to 10 times the value of the underlying infrastructure capital expenditure. With trillions being invested in AI infrastructure, future value creation at the application layer will be astronomically larger.

When a new technology stack like AI emerges, the infrastructure layer (chips, networking) inflects first and has the most identifiable winners. Sacerdote argues the application and model layers are riskier and less predictable, similar to the early, chaotic days of internet search engines before Google's dominance.

Initially, the market crowned OpenAI (via its proxies NVIDIA and Microsoft) the definitive AI leader. Now, with Google and Anthropic achieving comparable model performance, the market is re-evaluating. This volatility shows investors shifting from a "one winner" thesis toward a landscape in which top AI models are becoming commoditized.

Unlike prior tech cycles with a clear direction, the AI wave has a deep divide. SaaS vendors see AI enhancing existing applications, while venture capitalists bet that AI models will subsume and replace the entire SaaS application layer, creating massive disruption.

Value in the AI stack will concentrate at the infrastructure layer (e.g., chips) and the horizontal application layer. The "middle layer" of vertical SaaS companies, whose value is primarily encoded business logic, is at risk of being commoditized by powerful, general AI agents.

Conventional venture capital wisdom of 'winner-take-all' may not apply to AI applications. The market is expanding so rapidly that it can sustain multiple, fast-growing, highly valuable companies, each capturing a significant niche. For VCs, this means huge returns don't necessarily require backing a monopoly.

The AI value chain flows from hardware (NVIDIA) to apps, with LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer. This competition drives down API costs, preventing model providers from having excessive pricing power and allowing apps to build sustainable businesses.

Contrary to the 'winner-take-all' narrative, the rapid pace of innovation in AI is leading to a different outcome. As rival labs quickly match or exceed each other's model capabilities, the underlying Large Language Models (LLMs) risk becoming commodities, making it difficult for any single player to justify stratospheric valuations long-term.