Rather than building its own models, Razer pursues a model-agnostic strategy. It selects different best-in-class LLMs for specific use cases (Grok for conversation, ChatGPT for reasoning) and focuses its R&D on the integration layer that provides context and persistence.
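
As a rough sketch of what that kind of integration layer can look like, the Python below routes each use case to a different provider while keeping a single shared conversation store, so context persists across model switches. All names, the placeholder model identifiers, and the call_provider stub are illustrative assumptions, not Razer's actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical routing table: each use case maps to a (provider, model) pair.
# Model names are placeholders, not the exact models Razer ships.
ROUTING = {
    "conversation": ("grok", "conversation-model"),
    "reasoning": ("openai", "reasoning-model"),
}

def call_provider(provider: str, model: str, messages: list) -> str:
    """Stand-in for each vendor's SDK; swap in the real client per provider."""
    return f"[{provider}/{model}] reply to: {messages[-1]['content']}"

@dataclass
class ConversationStore:
    """Shared persistence layer, so context survives a switch between models."""
    history: list = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        self.history.append({"role": role, "content": content})

def route(use_case: str, prompt: str, store: ConversationStore) -> str:
    provider, model = ROUTING.get(use_case, ROUTING["conversation"])
    store.add("user", prompt)
    reply = call_provider(provider, model, store.history)
    store.add("assistant", reply)
    return reply

store = ConversationStore()
print(route("conversation", "Recommend a headset", store))
print(route("reasoning", "Compare the options above", store))  # same context, different model
```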

Related Insights

Recognizing there is no single "best" LLM, AlphaSense built a system to test and deploy various models for different tasks. This allows them to optimize for performance and even stylistic preferences, using different models for their buy-side finance clients versus their corporate users.

The "AI wrapper" concern is mitigated by a multi-model strategy. A startup can integrate the best models from various providers for different tasks, creating a superior product. A platform like OpenAI is incentivized to only use its own models, creating a durable advantage for the startup.

Simply offering the latest model is no longer a competitive advantage. True value is created in the system built around the model—the system prompts, tools, and overall scaffolding. This 'harness' is what optimizes a model's performance for specific tasks and delivers a superior user experience.
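
A hedged illustration of that harness idea: the raw model call at the bottom is interchangeable, while the system prompt, tool registry, and bounded tool-use loop around it carry the task-specific value. Every name and behavior here is invented for the example, and the model call is a stub so the sketch runs without any external API.

```python
# All names here are hypothetical; raw_model stands in for any provider call.

SYSTEM_PROMPT = "You are a support assistant. Use tools when the user asks about orders."

TOOLS = {
    "lookup_order": lambda order_id: {"order_id": order_id, "status": "shipped"},
}

def raw_model(messages: list) -> dict:
    """Stand-in for a provider call that can either answer or request a tool."""
    last = messages[-1]
    if last["role"] == "tool":
        return {"text": f"Here is your order status: {last['content']}"}
    if "order" in last["content"].lower():
        return {"tool": "lookup_order", "args": {"order_id": "A123"}}
    return {"text": f"Answer: {last['content']}"}

def run_harness(user_msg: str) -> str:
    messages = [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_msg},
    ]
    for _ in range(3):  # scaffolding: a bounded tool-use loop around the model
        out = raw_model(messages)
        if "tool" in out:
            result = TOOLS[out["tool"]](**out["args"])
            messages.append({"role": "tool", "content": str(result)})
            continue
        return out["text"]
    return "Could not complete the request."

print(run_harness("Where is my order?"))
```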

Microsoft is not solely reliant on its OpenAI partnership. It actively integrates competitor models, such as Anthropic's, into its Copilot products to handle specific workloads where they perform better, like complex Excel tasks. This pragmatic "best tool for the job" approach diversifies its AI capabilities.

Rather than committing to a single LLM provider like OpenAI or Gemini, Hux uses multiple commercial models. They've found that different models excel at different tasks within their app. This multi-model strategy allows them to optimize for quality and latency on a per-workflow basis, avoiding a one-size-fits-all compromise.

Enterprises will shift from relying on a single large language model to using orchestration platforms. These platforms will allow them to 'hot swap' various models—including smaller, specialized ones—for different tasks within a single system, optimizing for performance, cost, and use case without being locked into one provider.
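
One way to picture such an orchestration layer is a config-driven registry in which the model behind each task can be hot-swapped without touching application code. The sketch below uses made-up task names, model names, and an invoke() stub; it does not describe any specific vendor's platform.

```python
# Hypothetical task-to-model assignments; in practice this would live in a
# config file or dashboard so models can be swapped per task without a redeploy.
TASK_CONFIG = {
    "summarize_ticket": {"model": "small-specialized-model", "max_cost_usd": 0.001},
    "draft_contract": {"model": "large-general-model", "max_cost_usd": 0.05},
}

def invoke(model: str, payload: str) -> str:
    """Stand-in for whatever client the chosen provider exposes."""
    return f"[{model}] processed: {payload[:40]}"

def run_task(task: str, payload: str) -> str:
    cfg = TASK_CONFIG[task]
    return invoke(cfg["model"], payload)

# Hot swap: pointing a task at a different model is a config change, not a code change.
TASK_CONFIG["summarize_ticket"]["model"] = "other-specialized-model"
print(run_task("summarize_ticket", "Customer reports login failures since Tuesday."))
```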

The initial AI rush for every company to build proprietary models is over. The new winning strategy, seen with firms like Adobe, is to leverage existing product distribution by integrating multiple best-in-class third-party models, enabling faster and more powerful user experiences.

The belief that a single, god-level foundation model would dominate has proven false. Horowitz points to successful AI applications like Cursor, which uses 13 different models. This shows that value lies in the complex orchestration and design at the application layer, not just in having the largest single model.

The common critique of AI application companies as "GPT wrappers" with no moat is proving false. The best startups are evolving beyond using a single third-party model. They are using dozens of models and, crucially, are backward-integrating to build their own custom AI models optimized for their specific domain.

Alexa's architecture is a model-agnostic system using over 70 different models. This allows the team to use the best tool for any given task, focusing on the customer's goal rather than the underlying model brand that most competitors fixate on.
