An emerging AI growth strategy involves using expensive frontier models to acquire users and distribution at an explosive rate, accepting poor initial margins. Once critical mass is reached, the company introduces its own fine-tuned, cheaper model, drastically improving unit economics overnight and capitalizing on the established user base.
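The margin swing described above is simple arithmetic. A minimal sketch, with all prices and costs as illustrative assumptions (none of these figures come from the source):

```python
# Hypothetical unit economics for the model-swap playbook described above.
# All dollar figures are illustrative assumptions, not numbers from the source.

def gross_margin(price_per_user: float, cost_per_user: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (price_per_user - cost_per_user) / price_per_user

price = 20.0           # assumed monthly subscription per user
frontier_cost = 16.0   # assumed inference spend per user on a frontier model
fine_tuned_cost = 3.0  # assumed cost of the same workload on a fine-tuned model

print(f"growth-phase margin: {gross_margin(price, frontier_cost):.0%}")   # 20%
print(f"post-swap margin:    {gross_margin(price, fine_tuned_cost):.0%}") # 85%
```

Under these assumed numbers, swapping the model takes gross margin from 20% to 85% with no change in pricing or user base, which is the "overnight" improvement the strategy relies on.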

Related Insights

Box CEO Aaron Levie advises against building complex workarounds for the limitations of cheaper, older AI models. This "scaffolding" becomes obsolete with each new model release. To stay competitive, companies must absorb the cost of using the best available model, because competitors certainly will.

Established SaaS firms avoid AI-native products because those products carry lower gross margins (e.g., ~40%) than traditional software (80%+). This parallels brick-and-mortar retail's fatal hesitation over e-commerce, creating an opening for AI-native startups to capture the market by embracing different unit economics.

Pre-reasoning AI models were static assets that depreciated quickly. The advent of reasoning allows models to learn from user interactions, re-establishing the classic internet flywheel: more usage generates data that improves the product, which attracts more users. This creates a powerful, compounding advantage for the leading labs.

AI is creating a fork in marketing strategy. It disrupts traditional demand acquisition channels like search, making it harder and more expensive to get measurable traffic. Simultaneously, it provides powerful new tools to monetize existing demand more effectively. This forces a strategic shift from a volume-based to a value-extraction model.

AI labs like Anthropic find that mid-tier models can be trained with reinforcement learning to outperform the lab's largest, most expensive models within just a few months, accelerating the pace of capability improvements.

In the current market, AI companies see explosive growth through two primary vectors: attaching to the massive AI compute spend or directly replacing human labor. Companies merely using AI to improve an existing product, without hitting one of these drivers, risk being discounted because they lack a clear, exponential growth narrative.

AI companies operate under the assumption that LLM prices will trend towards zero. This strategic bet means they intentionally de-prioritize heavy investment in cost optimization today, focusing instead on capturing the market and building features, confident that future, cheaper models will solve their margin problems for them.

Unlike SaaS, where high gross margins are key, an AI company with very high margins likely isn't seeing significant use of its core AI features. Low margins signal that customers are actively using compute-intensive products, a positive early indicator.

Many AI startups prioritize growth, leading to unsustainable gross margins (below 15%) due to high compute costs. This is a ticking time bomb. Eventually, these companies must undertake a costly, time-consuming re-architecture to optimize for cost and build a viable business.

Instead of offering a model selector, creating a proprietary, branded model allows a company to chain different specialized models for various sub-tasks (e.g., search, generation). This not only improves overall performance but also provides business independence from the pricing and launch cycles of a single frontier model lab.
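The chaining idea above can be sketched as a simple dispatch layer. Everything here is hypothetical: the model names, the sub-task list, and the routing logic are illustrative assumptions, not any real product's architecture.

```python
# Illustrative sketch of routing sub-tasks to specialized models behind one
# branded interface. Model names and dispatch rules are hypothetical.

SPECIALISTS = {
    "search": "acme-retriever-v2",   # hypothetical retrieval-tuned model
    "generate": "acme-writer-v1",    # hypothetical generation-tuned model
    "summarize": "acme-compact-v1",  # hypothetical cheap summarization model
}

def route(task: str) -> str:
    """Pick the specialist for a sub-task; callers never see model names."""
    return SPECIALISTS.get(task, "acme-generalist-v1")

def answer(query: str) -> list[tuple[str, str]]:
    """Chain sub-tasks: retrieve context, draft an answer, then compress it."""
    pipeline = ["search", "generate", "summarize"]
    return [(step, route(step)) for step in pipeline]

print(answer("why did margins improve?"))
```

Because callers only see the branded interface, any specialist can be retrained or replaced without touching the product, which is the independence from a single frontier lab's pricing and launch cycle that the insight describes.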

The AI 'Bait and Switch' Growth Model | RiffOn