Box CEO Aaron Levie advises against building complex workarounds for the limitations of cheaper, older AI models. This "scaffolding" becomes obsolete with each new model release. To stay competitive, companies must absorb the cost of using the best available model, because their competitors certainly will.

Related Insights

Overly structured, workflow-based systems that work with today's models will become bottlenecks tomorrow. Engineers must be prepared to shed abstractions and rebuild simpler, more general systems to capture the gains from exponentially improving models.

The "AI wrapper" concern is mitigated by a multi-model strategy. A startup can integrate the best models from various providers for different tasks, creating a superior product. A platform like OpenAI is incentivized to only use its own models, creating a durable advantage for the startup.

Simply offering the latest model is no longer a competitive advantage. True value is created in the system built around the model—the system prompts, tools, and overall scaffolding. This 'harness' is what optimizes a model's performance for specific tasks and delivers a superior user experience.
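
A rough sketch of that idea in Python: the model is treated as a swappable callable, while the harness owns the system prompt, the tool registry, and the orchestration around it. The `Harness` class and the `TOOL:` reply convention are toy illustrations standing in for real provider tool-calling APIs.

```python
# A "harness" is everything the product wraps around a raw model call:
# the system prompt, the available tools, and the dispatch logic.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Harness:
    system_prompt: str                                    # task-specific instructions
    model: Callable[[str, str], str]                      # (system, user) -> text; any provider
    tools: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def run(self, user_input: str) -> str:
        # First pass: let the model decide whether it needs a tool.
        draft = self.model(self.system_prompt, user_input)
        # Toy convention: a reply of the form "TOOL:<name>:<arg>" requests a tool call.
        if draft.startswith("TOOL:"):
            _, name, arg = draft.split(":", 2)
            tool_result = self.tools[name](arg)
            # Second pass: hand the tool output back and ask for a final answer.
            return self.model(self.system_prompt, f"{user_input}\n\nTool result: {tool_result}")
        return draft

if __name__ == "__main__":
    # Stand-in model that always requests the (hypothetical) "search" tool once.
    fake_model = lambda system, prompt: (
        "TOOL:search:pricing docs" if "Tool result" not in prompt else "Final answer."
    )
    h = Harness(
        system_prompt="You are a document assistant.",
        model=fake_model,
        tools={"search": lambda q: f"3 documents matched '{q}'"},
    )
    print(h.run("What does our pricing doc say?"))
```

Swapping in a newer model is a one-line change to `model`; the harness is where the product-specific quality accumulates.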

Features built to guide AI agents, like an explicit "plan mode," will become obsolete as models become more capable. The Claude Code team embraces this, building what's needed for the best current experience and fully expecting to delete that code when a new model renders it unnecessary.

In the fast-paced world of AI, focusing only on the limitations of current models is a failing strategy. GitHub's CPO advises product teams to design for the future capabilities they anticipate. This ensures that when a more powerful model drops, the product experience can be rapidly upgraded to its full potential.

AI companies operate under the assumption that LLM prices will trend towards zero. This strategic bet means they intentionally de-prioritize heavy investment in cost optimization today, focusing instead on capturing the market and building features, confident that future, cheaper models will solve their margin problems for them.

The enduring moat in the AI stack lies in what is hardest to replicate. Since building foundation models is significantly more difficult than building applications on top of them, the model layer is inherently more defensible and will naturally capture more value over time.

The Browser Company's Dia browser was built on the conviction that AI models would rapidly improve. Core features like "memory" were initially impossible, got cut, and were revived just before launch when a new model suddenly unlocked the capability, validating the forward-looking bet on the technology's trajectory.

An AI tool's quality is now almost entirely dependent on its underlying model. The guest notes that Windsurf, a top-tier agent just three weeks prior, dropped to 'C-tier' simply because it hadn't yet integrated Claude 4, highlighting the brutal pace of innovation.

Instead of offering a model selector, a company can present a single proprietary, branded model that internally chains specialized models for different sub-tasks (e.g., search, generation). This not only improves overall performance but also provides business independence from the pricing and launch cycles of any single frontier lab.
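
A small sketch of that chaining pattern, with each stage as a placeholder callable for whichever provider currently wins that step; the function names and the demo lambdas are hypothetical.

```python
# One branded entry point that internally chains specialized stages:
# a small model rewrites the query, a retrieval step fetches context,
# and a stronger model drafts the final, grounded answer.
from typing import Callable, List

def branded_model(
    question: str,
    rewrite: Callable[[str], str],               # small model: turn the question into a search query
    search: Callable[[str], List[str]],          # retrieval system: return relevant documents
    generate: Callable[[str, List[str]], str],   # frontier model: answer grounded in the documents
) -> str:
    query = rewrite(question)
    docs = search(query)
    return generate(question, docs)

if __name__ == "__main__":
    answer = branded_model(
        "What changed in our pricing last quarter?",
        rewrite=lambda q: q.lower(),                           # stand-in for a small rewriting model
        search=lambda q: ["pricing_notes.txt: ..."],           # stand-in for real retrieval
        generate=lambda q, docs: f"Answer based on {len(docs)} document(s).",
    )
    print(answer)
```

Because callers only ever see the branded entry point, any stage can be re-pointed at a different provider without breaking the product or waiting on a single lab's release schedule.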