Drawing a parallel to Web3's 'Fat Protocols' thesis, today's large AI models are capturing the majority of value in the tech stack. As these 'fat' models become more capable, the applications built on top become 'thinner,' serving primarily as simple wrappers or marketing channels rather than creating defensible value.
Frontier model companies can raise more capital than the entire application layer built upon them combined. This unique financial power allows them to systematically expand into and absorb the value of their ecosystem, a dynamic not seen in previous platform shifts like cloud computing.
The AI value stack has evolved from chips (NVIDIA) to models (OpenAI). The next critical phase is the application layer. Whether value will be captured by new application companies or absorbed by the underlying model providers remains a key open question for investors and founders.
Similar to how blockchain protocols like Bitcoin and Ethereum accrued more value than the apps built on them, AI foundation models are getting 'fatter.' They are absorbing more capabilities, allowing users to perform complex tasks in a single step within the base model, reducing the need for specialized application-layer companies.
As large AI models absorb functions of traditional SaaS products, investors and entrepreneurs are shifting focus back to tech-enabled services. Integrating AI deeply into physical services and workflows is now seen as creating more defensible, lasting value than pure software, reversing a years-long trend.
Gurley notes that major AI model providers like OpenAI and Anthropic are shifting from solely selling API access to building their own applications. This move up the stack signals a fear that being a pure model provider is not a defensible moat and invites commoditization.
The enduring moat in the AI stack lies in what is hardest to replicate. Since building foundation models is significantly more difficult than building applications on top of them, the model layer is inherently more defensible and will naturally capture more value over time.
Comparing AI to 1995-era internet bandwidth, the hosts argue that selling raw 'intelligence' is a low-margin, commodity business. The significant financial upside will be captured not by the infrastructure providers, but by the creators who build novel applications and experiences using that intelligence as a building block.
Value in the AI stack will concentrate at the infrastructure layer (e.g., chips) and the horizontal application layer. The "middle layer" of vertical SaaS companies, whose value is primarily encoded business logic, is at risk of being commoditized by powerful, general AI agents.
Unlike software bottlenecked by engineering headcount, AI models scale with capital. A frontier model company can raise more than its entire app ecosystem combined, then use that capital to launch competitive first-party apps and subsume third-party developers.
The common critique of AI application companies as "GPT wrappers" with no moat is proving false. The best startups are evolving beyond using a single third-party model. They are using dozens of models and, crucially, are backward-integrating to build their own custom AI models optimized for their specific domain.