Foundation model companies like OpenAI won't dominate the enterprise application layer. Just as AWS became the infrastructure on which an explosion of software companies was built, LLMs will serve as infrastructure for AI applications. The core business and go-to-market motion of model providers are fundamentally different from what's required to sell complex enterprise solutions.

Related Insights

The fear that large AI labs will dominate all software is overblown. The competitive landscape will likely mirror Google's history: winning in some verticals (Maps, Email) while losing in others (Social, Chat). Victory will be determined by superior team execution within each specific product category, not by the sheer power of the underlying foundation model.

Counter to fears that foundation models will obsolete all apps, AI startups can build defensible businesses by embedding AI into unique workflows, owning the customer relationship, and creating network effects. This mirrors how top App Store apps succeeded despite Apple's platform dominance.

The ease of building applications on top of powerful LLMs will lead companies to create their own custom software instead of buying third-party SaaS products. This shift, combined with the risk of foundation models moving up the stack, signals the end of the traditional SaaS era.

Unlike sticky cloud infrastructure (AWS, GCP), LLMs are easily interchangeable via APIs, leading to customer "promiscuity" across providers. This commoditizes the model layer and pushes providers like OpenAI to build defensible moats at the application layer (e.g., ChatGPT), where they can own the end user.

The enduring moat in the AI stack lies in what is hardest to replicate. Since building foundation models is significantly more difficult than building applications on top of them, the model layer is inherently more defensible and will naturally capture more value over time.

Enterprises will shift from relying on a single large language model to using orchestration platforms. These platforms will allow them to 'hot swap' various models—including smaller, specialized ones—for different tasks within a single system, optimizing for performance, cost, and use case without being locked into one provider.

With model improvements showing diminishing returns and competitors like Google achieving parity, OpenAI is shifting focus to enterprise applications. The strategic battleground is moving from foundational model superiority to practical, valuable productization for businesses.

While AI labs could build competing enterprise apps, the required effort (sales teams, customizations, support) is massive. For a multi-billion-dollar company, the resulting revenue would be a rounding error, making it an illogical distraction from their core model-building business.

The fundamental shift from AI isn't about replacing foundational model companies like OpenAI. Instead, AI creates a new technological substrate—productized intelligence—that will engender an entirely new breed of software companies, marking the end of the traditional SaaS playbook.

The AI value chain flows from hardware (NVIDIA) to apps, with LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer. This competition drives down API costs, preventing model providers from having excessive pricing power and allowing apps to build sustainable businesses.