
The AI value stack has evolved from chips (NVIDIA) to models (OpenAI); the next critical phase is the application layer. A key open question for investors and founders is whether value will be captured by new application companies or absorbed by the underlying model providers.

Related Insights

While investors now believe in AI's transformative power, it remains unclear who will profit most. Value could accrue to chip makers (NVIDIA), foundation models (OpenAI), or the application layer. This fundamental uncertainty is a primary driver of the significant volatility across the tech sector.

Like containerization, AI is a transformative technology where value may accrue to customers and users, not the creators of the core infrastructure. The biggest fortunes from containerization were made by companies like Nike and Apple that leveraged global supply chains, not by investors in the container companies themselves.

While immense value is being *created* for end-users via applications like ChatGPT, that value is primarily *accruing* to companies with deep moats in the infrastructure layer—namely hardware providers like NVIDIA and hyperscalers. The long-term defensibility of model-makers remains an open question.

Historical tech cycles like the cloud and mobile demonstrate a consistent pattern: the application layer ultimately generates 5 to 10 times the value of the underlying infrastructure capital expenditure. With trillions being invested in AI infrastructure, future value creation at the application layer will be astronomically larger.

Despite massive investment in chips (NVIDIA) and models (OpenAI), it is not yet clear where long-term value will concentrate. The entire stack is in flux. Models could be commoditized by open source, chips could face historical commoditization cycles, and new AI-native apps could capture the most value. We are only in the early innings of a 30-year shift.

Comparing AI to 1995-era internet bandwidth, the hosts argue that selling raw 'intelligence' is a low-margin, commodity business. The significant financial upside will be captured not by the infrastructure providers, but by the creators who build novel applications and experiences using that intelligence as a building block.

Value in the AI stack will concentrate at the infrastructure layer (e.g., chips) and the horizontal application layer. The "middle layer" of vertical SaaS companies, whose value is primarily encoded business logic, is at risk of being commoditized by powerful, general AI agents.

Unlike traditional software businesses, which are bottlenecked by engineering headcount, AI models scale with capital. A frontier model company can raise more than its entire app ecosystem combined, then use that capital to launch competitive first-party apps and subsume third-party developers.

The AI value chain flows from hardware (NVIDIA) to apps, with LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer. This competition drives down API costs, preventing model providers from having excessive pricing power and allowing apps to build sustainable businesses.

As foundation AI models become commoditized 'intelligence utilities,' economic value moves up the stack. Orchestrators like OpenClaw, which can intelligently route tasks to the most efficient model based on cost or use case, are positioned to capture the margin that the underlying model providers cannot.