Enterprise platform ServiceNow is offering customers access to models from both major AI labs, OpenAI and Anthropic. This "model choice" strategy directly addresses a primary enterprise fear of being locked into a single AI provider, and lets customers use the best model for each specific job.
To survive against subsidized tools from model providers like OpenAI and Anthropic, AI applications must avoid a price war. Instead, the winning strategy is to compete on a superior product experience and to serve as a neutral orchestration layer that lets users choose the best underlying model.
Recognizing there is no single "best" LLM, AlphaSense built a system to test and deploy various models for different tasks. This allows them to optimize for performance and even stylistic preferences, using different models for their buy-side finance clients versus their corporate users.
Instead of standardizing on one LLM or coding assistant, Brex offers licenses for several competing options. This employee choice provides clear usage data, giving Brex leverage to resist wall-to-wall deployments and negotiate better vendor contracts.
The "AI wrapper" concern is mitigated by a multi-model strategy. A startup can integrate the best models from various providers for different tasks, creating a superior product. A platform like OpenAI is incentivized to only use its own models, creating a durable advantage for the startup.
Microsoft is not solely reliant on its OpenAI partnership. It actively integrates competitor models, such as Anthropic's, into its Copilot products to handle specific workloads where they perform better, like complex Excel tasks. This pragmatic "best tool for the job" approach diversifies its AI capabilities.
Rather than committing to a single LLM provider like OpenAI or Google (Gemini), Hux uses multiple commercial models. They've found that different models excel at different tasks within their app. This multi-model strategy allows them to optimize for quality and latency on a per-workflow basis, avoiding a one-size-fits-all compromise.
Enterprises will shift from relying on a single large language model to using orchestration platforms. These platforms will allow them to 'hot swap' various models—including smaller, specialized ones—for different tasks within a single system, optimizing for performance, cost, and use case without being locked into one provider.
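A minimal sketch of what that 'hot swap' pattern could look like, assuming hypothetical backend callables, model names, and task labels (nothing here is any specific vendor's API): the key point is that moving a workload to a different model is a routing-table change, not an application rewrite.

```python
# Illustrative per-task model routing. Backends are stand-ins for real
# provider SDK calls; names and tasks are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict

# A "backend" is any function that takes a prompt and returns text.
Backend = Callable[[str], str]

@dataclass
class ModelRouter:
    backends: Dict[str, Backend] = field(default_factory=dict)  # model name -> client call
    routes: Dict[str, str] = field(default_factory=dict)        # task -> model name

    def register(self, name: str, backend: Backend) -> None:
        self.backends[name] = backend

    def route(self, task: str, model_name: str) -> None:
        # "Hot swap": repointing a task at a different model is a config
        # change, so no single provider is load-bearing.
        self.routes[task] = model_name

    def run(self, task: str, prompt: str) -> str:
        return self.backends[self.routes[task]](prompt)

# Wiring: stubbed backends stand in for calls to different providers.
router = ModelRouter()
router.register("large-general", lambda p: f"[large-general] {p}")
router.register("small-summarizer", lambda p: f"[small-summarizer] {p}")

router.route("contract_review", "large-general")
router.route("ticket_summaries", "small-summarizer")  # cheaper, specialized model

print(router.run("ticket_summaries", "Summarize this incident ticket..."))

# Later, the summarization workload can be swapped onto another model:
router.route("ticket_summaries", "large-general")
```

The routing table is the only coupling point, so cost, latency, and quality trade-offs can be tuned per task without rewriting the workflows that call the models.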
Instead of choosing an exclusive AI partner, Apple could offer a choice of AI agents (OpenAI, Anthropic, etc.) at setup, similar to the EU's browser choice screen. This would create a competitive marketplace for AI assistants on billions of devices, driving significant investment and innovation across the industry.
Just as Kayak aggregates flights, a model aggregator provides superior value to users who want the best tool for a specific job. Big tech companies are restricted to their own models, creating an opportunity for startups to win by offering a 'single pane of glass' across all available models.
OpenAI's partnership with ServiceNow isn't about building a competing product; it's about embedding its "agentic" AI directly into established platforms. This strategy focuses on becoming the core intelligence layer for existing enterprise systems, allowing AI to act as an automated teammate within familiar workflows.