Glean spent years solving unsexy enterprise search problems before the AI boom. This deep, unglamorous work, often dismissed in a narrative that credits AI alone for the company's success, became its key competitive advantage once the category became popular.
The founders initially feared their data-collection hardware would be easily copied. Instead, the defensible moat turned out to be the unexpectedly hard work of scaling the full-stack system: integrating hardware iterations, data pipelines, and training loops.
The notion of building a business as a 'thin wrapper' around a foundation model like GPT is flawed. Truly defensible AI products, like Cursor, build numerous specific, fine-tuned models to deeply understand a user's domain. This creates a data and performance moat that a generic model cannot easily replicate, much as Salesforce was more than just a 'thin wrapper' on a database.
Incumbent companies are slowed by the need to retrofit AI onto existing processes and entrenched tribal knowledge. AI-native startups, however, can build their entire operational model around agent-based, prompt-driven workflows from day one, creating a structural advantage that is difficult for larger companies to copy.
Startups like Cognition Labs find their edge not by competing on pre-training large models, but by mastering post-training. They build specialized reinforcement learning environments that teach models specific, real-world workflows (e.g., using Datadog for debugging), creating a defensible niche that larger players overlook.
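To make the idea concrete, here is a minimal, purely illustrative sketch of what such a post-training environment could look like. The class, actions, and reward scheme are hypothetical stand-ins for the live tool integrations (for example, a Datadog-like monitoring API) that a real setup would wrap; nothing here describes Cognition's actual system.

```python
# Hypothetical toy "RL environment": an agent must pick the right sequence of
# tool calls to resolve an incident. A real environment would wrap live tools
# and score entire debugging trajectories rather than single steps.
import random

class ToyDebuggingEnv:
    """Episode: an incident with a hidden root cause; the agent issues tool
    actions and is rewarded for fixing the issue after gathering evidence."""

    ACTIONS = ["query_logs", "check_metrics", "inspect_trace", "apply_fix"]

    def reset(self, seed=None):
        rng = random.Random(seed)
        # Hidden root cause the agent should uncover before fixing.
        self.root_cause = rng.choice(["memory_leak", "bad_deploy", "db_timeout"])
        self.evidence_seen = False
        return {"alert": "p99 latency spike", "evidence": None}

    def step(self, action):
        if action in ("query_logs", "check_metrics", "inspect_trace"):
            # Investigation reveals evidence of the root cause, at a small cost.
            self.evidence_seen = True
            obs = {"alert": "p99 latency spike", "evidence": self.root_cause}
            return obs, -0.1, False
        if action == "apply_fix":
            # Fixing blind (without evidence) is penalized.
            reward = 1.0 if self.evidence_seen else -1.0
            return {"alert": "resolved"}, reward, True
        raise ValueError(f"unknown action: {action}")

# A policy trained against many such episodes learns the workflow itself:
# investigate first, then act. That behavior is what generic pre-training
# on internet text tends to miss.
env = ToyDebuggingEnv()
obs = env.reset(seed=0)
for a in ["check_metrics", "apply_fix"]:
    obs, reward, done = env.step(a)
    print(a, reward, done)
```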
Instead of building AI models, a company can create immense value by being 'AI adjacent'. The strategy is to focus on enabling good AI by solving the foundational 'garbage in, garbage out' problem. Providing high-quality, complete, and well-understood data is a critical and defensible niche in the AI value chain.
When asked whether AI commoditizes software, Bravo argues that durable moats aren't just code, which can be replicated, but the deep understanding of customer processes and the ability to service them. This involves re-engineering organizations, not just deploying a product.
The enduring moat in the AI stack lies in what is hardest to replicate. Since building foundation models is significantly more difficult than building applications on top of them, the model layer is inherently more defensible and will naturally capture more value over time.
Creating a basic AI coding tool is easy. The defensible moat comes from building a vertically integrated platform with its own backend infrastructure, such as databases, user management, and integrations. This is extremely difficult for competitors to replicate, especially if they rely on third-party services like Supabase.
A key competitive advantage wasn't just the user network, but the sophisticated internal tools built for the operations team. Investing early in a flexible, 'drag-and-drop' system for creating complex AI training tasks allowed them to pivot quickly and meet diverse client needs, a capability competitors lacked.
Contrary to the belief that distribution is the new moat, the crucial differentiator in AI is talent. Building a truly exceptional AI product is incredibly nuanced and complex, requiring a rare skill set. The scarcity of people who can build on top of models in an intelligent, tasteful way is the real technological moat, not just access to data or customers.