OpenAI's partnership with Thrive Holdings, in which OpenAI takes an equity stake in the holding company, suggests a new go-to-market model. Rather than tech firms pushing general-purpose AI "outside-in," this "inside-out" approach embeds AI development within established industry operators to build, test, and improve domain-specific models against real-world feedback loops.
Instead of selling software to traditional industries, a more defensible approach is to build vertically integrated companies: acquire or start a business in a non-sexy industry (e.g., a law firm or a hospital) and rebuild its entire operational stack with AI at its core, something a pure software vendor cannot do.
The key for enterprises isn't integrating general AI like ChatGPT but creating "proprietary intelligence." This involves fine-tuning smaller, custom models on their unique internal data and workflows, creating a competitive moat that off-the-shelf solutions cannot replicate.
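As a concrete illustration of what "proprietary intelligence" can mean in practice, here is a minimal sketch of parameter-efficient fine-tuning of a small open-weight model on internal text. The base model, file path, and hyperparameters are illustrative assumptions, not a prescribed recipe.

```python
# Hedged sketch: LoRA fine-tuning a small open-weight model on internal documents.
# The model name, data path, and hyperparameters below are placeholder assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "Qwen/Qwen2.5-0.5B"  # any small open-weight model would do
tok = AutoTokenizer.from_pretrained(base)
if tok.pad_token is None:
    tok.pad_token = tok.eos_token

model = AutoModelForCausalLM.from_pretrained(base)
# Attach low-rank adapters so only a small fraction of weights are trained.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# "Internal data": a JSONL file of {"text": ...} records (hypothetical path).
data = load_dataset("json", data_files="internal_workflows.jsonl")["train"]
data = data.map(lambda ex: tok(ex["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="proprietary-model", num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
).train()
```

The specific stack matters less than the point it illustrates: the resulting weights, and the data pipeline feeding them, stay inside the enterprise rather than with an external vendor.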
Enterprises struggle to get value from AI because they lack iterative data-science expertise. The winning model for AI companies isn't just selling APIs but embedding "forward-deployed" teams of engineers and scientists who co-create solutions, closing the gap between prototype and production value.
The window in which startups can build on frontier-model APIs may be temporary. Emad Mostaque predicts that once models are sufficiently capable, labs like OpenAI will shut off API access and use their superior internal models to outcompete businesses in every sector, fulfilling their AGI mission.
The true enterprise value of AI lies not in consuming third-party models, but in building internal capabilities to diffuse intelligence throughout the organization. This means creating proprietary "AI factories" rather than just using external tools and admiring others' success.
OpenAI's non-profit parent retains a 26% stake (worth $130B) in its for-profit arm. This novel structure allows the organization to leverage commercial success to generate massive, long-term funding for its original, non-commercial mission, creating a powerful, self-sustaining philanthropic engine.
Initially, even OpenAI believed a single, ultimate "model to rule them all" would emerge. That thinking has since reversed in favor of a proliferation of specialized models, creating a healthier, less winner-take-all ecosystem where different models serve different needs.
Similar to how "born in the cloud" MSPs disrupted the channel ecosystem, a new category of "born in AI" partners is now emerging. These specialized firms are built from the ground up to deliver AI solutions. Legacy partners must adapt by building or acquiring AI practices to compete with these new, highly focused players.
Contrary to early narratives, a proprietary dataset is not the primary moat for AI applications. Lasting defensibility comes instead from deep integration into an industry's ecosystem: connecting different stakeholders, leveraging strategic partnerships, and using funding velocity to build the broadest product suite.
Large companies integrate AI through three primary methods: buying third-party vendor solutions (e.g., Harvey for legal), building custom internal tools to improve efficiency, or embedding AI directly into their customer-facing products. Understanding these pathways is critical for any B2B AI startup's go-to-market strategy.