Unlike software bottlenecked by engineering headcount, AI models scale with capital. A frontier model company can raise more than its entire app ecosystem combined, then use that capital to launch competitive first-party apps and subsume third-party developers.
Unlike traditional SaaS, where a bootstrapped company could eventually catch up to funded rivals, AI offers no such path: the high, ongoing cost of talent and compute turns an early capital advantage into a permanent, widening moat, making it nearly impossible for capital-light players to compete.
Creating frontier AI models is incredibly expensive, yet their value depreciates rapidly as lower-cost open-source alternatives replicate them. This forces model providers to evolve into more defensible application companies to survive.
Frontier models can raise more capital than the entire application layer built upon them. This unique financial power allows them to systematically expand and absorb the value of their ecosystem, a dynamic not seen in previous platforms like cloud computing.
The enduring moat in the AI stack lies in what is hardest to replicate. Since building foundation models is significantly more difficult than building applications on top of them, the model layer is inherently more defensible and will naturally capture more value over time.
Fears of a single AI company achieving runaway dominance are proving unfounded, as the number of frontier models has tripled in a year. Newcomers can use techniques like synthetic data generation to effectively "drink the milkshake" of incumbents, reverse-engineering their intelligence at lower costs.
While a current AI model may be gross-margin positive on inference, the company is not. The staggering cost of training the *next* model makes them gross-margin negative overall. Their business model relies on raising ever-larger rounds to fund R&D, a potentially unsustainable cycle.
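The inference-positive, training-negative dynamic above can be sketched with round, hypothetical numbers (these figures are illustrative assumptions, not any company's actuals):

```python
# Hypothetical round numbers to illustrate the dynamic; not real financials.
inference_revenue   = 2_000_000_000  # $2B from serving the current model
serving_cost        = 1_200_000_000  # $1.2B of inference compute
next_model_training = 3_000_000_000  # $3B to train the successor model

# Margin on serving the current model alone is healthy...
inference_margin = (inference_revenue - serving_cost) / inference_revenue

# ...but including the next model's training cost flips the total negative.
overall = inference_revenue - serving_cost - next_model_training

print(f"inference margin: {inference_margin:.0%}")  # 40%
print(f"net after next-model training: ${overall / 1e9:.1f}B")
```

The gap between the two figures is what the ever-larger funding rounds have to cover.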
AI-native companies grow so rapidly that, at the $100M ARR scale, they burn roughly a quarter as much as traditional SaaS for each incremental dollar of ARR. This superior burn multiple makes them more attractive to VCs, even with higher operating costs from token spend.
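The comparison above rests on the standard burn-multiple calculation (net burn divided by net new ARR). A minimal sketch, with hypothetical figures chosen only to match the stated 4x gap:

```python
def burn_multiple(net_burn: float, net_new_arr: float) -> float:
    """Dollars burned per incremental dollar of ARR added (lower is better)."""
    return net_burn / net_new_arr

# Hypothetical figures at ~$100M ARR scale; not drawn from real companies.
saas      = burn_multiple(net_burn=150_000_000, net_new_arr=50_000_000)   # 3.0
ai_native = burn_multiple(net_burn=75_000_000, net_new_arr=100_000_000)   # 0.75

print(saas / ai_native)  # 4.0 — the AI-native company is 4x more capital-efficient
```

Any pair of burn and ARR figures in the same 4:1 ratio of multiples would illustrate the same claim.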
Unlike traditional software, AI model companies can convert capital directly into a better product via compute. This creates a rapid fundraising-to-growth cycle, where money produces a superior model with a small team, generating immediate demand and fueling the next, larger round.
The AI value chain flows from hardware (NVIDIA) to apps, with LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer. This competition drives down API costs, preventing model providers from having excessive pricing power and allowing apps to build sustainable businesses.
Contrary to early narratives, a proprietary dataset is not the primary moat for AI applications. True, lasting defensibility is built by deeply integrating into an industry's ecosystem—connecting different stakeholders, leveraging strategic partnerships, and using funding velocity to build the broadest product suite.