In a crowded market where startups offer free or heavily subsidized AI tokens to gain users, Vercel intentionally prices its tokens at cost. Rather than undercut the market, the company bets that a superior, higher-quality product will win customers willing to pay for value.

Related Insights

As AI makes it easy to generate 'good enough' software, a functional product is no longer a moat. The new advantage is creating an experience so delightful that users prefer it over a custom-built alternative. This makes design the primary driver of value, setting premium software apart from the infinitely generated.

Tech giants like Google and Meta are positioned to offer their premium AI models for free, leveraging their massive ad-based business models. This strategy aims to cut off OpenAI's primary revenue stream from $20/month subscriptions. For incumbents, subsidizing AI is a strategic play to acquire users and boost market capitalization.

In the fast-evolving AI space, Vercel's AI SDK deliberately stayed low-level. CTO Malte Ubl explains that because "we know absolutely nothing" about future AI app patterns, providing a flexible, minimal toolkit was superior to competitors' rigid, high-level frameworks that baked in incorrect assumptions about user needs.
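For context, the low-level surface looks roughly like the sketch below: a single text-generation primitive with an explicitly passed model provider, leaving chat state, retries, and app structure to the developer. This is a minimal illustration assuming the AI SDK's generateText function and the @ai-sdk/openai provider package; the model name and prompt are placeholders, not taken from the source.

```typescript
// Minimal sketch of the AI SDK's low-level style: one primitive call,
// provider and model passed explicitly, no framework-imposed app structure.
// Model name and prompt are illustrative.
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Summarize why low-level SDK primitives age better than rigid frameworks.',
});

console.log(text);
```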

OpenPipe's founder felt pressure from frontier labs continually lowering token prices, which eroded the company's value proposition. Competition from GPU providers, however, never materialized because their fine-tuning services were too difficult to use, highlighting the persistent value of good developer experience.

Standard per-seat SaaS pricing fails for agentic products because heavy usage becomes a cost center rather than pure margin, and flat fees quietly reward the vendor when customers don't use the product. Instead, implement a hybrid model with a fixed base fee plus usage-based overages, or, ideally, tie pricing directly to measurable outcomes the AI generates.
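To make the hybrid model concrete, here is a minimal sketch of a monthly invoice calculation. The plan parameters (base fee, included token allowance, overage rate) are hypothetical figures chosen for illustration, not numbers from the source.

```typescript
// Hybrid pricing sketch: fixed platform fee plus usage-based overage.
// All plan parameters below are hypothetical, for illustration only.
interface Plan {
  baseFeeUsd: number;           // flat monthly fee, charged regardless of usage
  includedTokens: number;       // tokens covered by the base fee
  overageUsdPerMTokens: number; // price per 1M tokens beyond the allowance
}

function monthlyInvoiceUsd(plan: Plan, tokensUsed: number): number {
  const overageTokens = Math.max(0, tokensUsed - plan.includedTokens);
  const overageCharge = (overageTokens / 1_000_000) * plan.overageUsdPerMTokens;
  return plan.baseFeeUsd + overageCharge;
}

// Example: $50 base, 10M tokens included, $4 per additional 1M tokens.
const plan: Plan = { baseFeeUsd: 50, includedTokens: 10_000_000, overageUsdPerMTokens: 4 };
console.log(monthlyInvoiceUsd(plan, 25_000_000)); // 50 + (15 * 4) = 110
```

The fixed base keeps revenue predictable for light users, while the overage term ensures heavy agent usage never turns into an uncapped cost center.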

AI companies operate under the assumption that LLM prices will trend towards zero. This strategic bet means they intentionally de-prioritize heavy investment in cost optimization today, focusing instead on capturing the market and building features, confident that future, cheaper models will solve their margin problems for them.

As the current low-cost producer of AI tokens via its custom TPUs, Google's rational strategy is to operate at low or even negative margins. This "sucks the economic oxygen out of the AI ecosystem," making it difficult for capital-dependent competitors to justify their high costs and raise new funding rounds.

Unlike SaaS, where high gross margins are key, an AI company with very high margins likely isn't seeing significant use of its core AI features. Low margins signal that customers are actively using compute-intensive products, a positive early indicator.

The AI value chain flows from hardware (NVIDIA) to apps, with LLM providers currently capturing most of the margin. The long-term viability of app-layer businesses depends on a competitive model layer. This competition drives down API costs, preventing model providers from having excessive pricing power and allowing apps to build sustainable businesses.

In a world where AI makes software cheap or free, the primary value shifts to specialized human expertise. Companies can monetize by using their software as a low-cost distribution channel to sell high-margin, high-ticket services that customers cannot easily replicate, like specialized security analysis.