Instead of standardizing on one LLM or coding assistant, Brex offers licenses for several competing options. Letting employees choose generates clear usage data, which gives Brex leverage to resist wall-to-wall deployments and to negotiate better vendor contracts.

Related Insights

LLMs are becoming commoditized. Like gas from different stations, models can be swapped based on price or marginal performance. This means competitive advantage doesn't come from the model itself, but from how you use it with proprietary data.

Recognizing there is no single "best" LLM, AlphaSense built a system to test and deploy various models for different tasks. This allows them to optimize for performance and even stylistic preferences, using different models for their buy-side finance clients versus their corporate users.
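AlphaSense's internal system isn't public; a minimal sketch of the general pattern — a task-to-model registry that can route by client segment, with a fallback default — might look like this (all task, segment, and model names are hypothetical):

```python
# Hypothetical sketch of per-task, per-segment model routing.
# Task, segment, and model names are illustrative, not AlphaSense's actual system.
from dataclasses import dataclass, field


@dataclass
class ModelRegistry:
    # Maps a (task, segment) pair to a model identifier.
    routes: dict = field(default_factory=dict)

    def register(self, task: str, segment: str, model: str) -> None:
        self.routes[(task, segment)] = model

    def resolve(self, task: str, segment: str = "default") -> str:
        # Prefer a segment-specific route; fall back to the task's default.
        return self.routes.get((task, segment)) or self.routes[(task, "default")]


registry = ModelRegistry()
registry.register("summarize_filing", "default", "model-a")
registry.register("summarize_filing", "buy_side", "model-b")  # stylistic preference

print(registry.resolve("summarize_filing", "buy_side"))   # model-b
print(registry.resolve("summarize_filing", "corporate"))  # falls back to model-a
```

The key design choice is that routing is data, not code: swapping a model for one task or one client segment is a registry update, so new models can be tested and deployed without touching application logic.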

Traditional SaaS switching costs were based on painful data migrations, which LLMs may now automate. The new moat for AI companies is creating deep, customized integrations into a customer's unique operational workflows. This is achieved through long, hands-on pilot periods that make the AI solution indispensable and hard to replace.

Microsoft is not solely reliant on its OpenAI partnership. It actively integrates competitor models, such as Anthropic's, into its Copilot products to handle specific workloads where they perform better, like complex Excel tasks. This pragmatic "best tool for the job" approach diversifies its AI capabilities.

AI agent platforms are typically priced by usage, not seats, making initial costs low. Instead of a top-down mandate for one tool, leaders should encourage teams to expense and experiment with several options. The best solution for the team will emerge organically through use.

Rather than committing to a single LLM provider, such as OpenAI or Google (Gemini), Hux uses multiple commercial models. They've found that different models excel at different tasks within their app. This multi-model strategy allows them to optimize for quality and latency on a per-workflow basis, avoiding a one-size-fits-all compromise.
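Hux hasn't published its routing logic; one simple way to realize a per-workflow quality/latency trade-off is to pick the highest-quality candidate that fits each workflow's latency budget. A hedged sketch, with hypothetical model names and benchmark numbers:

```python
# Hypothetical per-workflow model selection balancing quality and latency.
# Candidate models and their quality/latency figures are made up for illustration.
candidates = {
    "draft_reply": [
        {"model": "fast-model", "quality": 0.78, "p50_latency_ms": 300},
        {"model": "strong-model", "quality": 0.92, "p50_latency_ms": 1800},
    ],
}


def pick(workflow: str, max_latency_ms: int) -> str:
    """Choose the highest-quality model that meets the workflow's latency budget."""
    viable = [c for c in candidates[workflow] if c["p50_latency_ms"] <= max_latency_ms]
    return max(viable, key=lambda c: c["quality"])["model"]


print(pick("draft_reply", 500))   # latency-sensitive path: fast-model
print(pick("draft_reply", 5000))  # quality-first path: strong-model
```

An interactive workflow with a tight budget gets the fast model; a background workflow with a generous budget gets the strongest one — no single model has to serve both.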

When AI startups demand access to your platform's data via API, turn the tables. Gate your APIs and, during negotiations, agree to their request on the condition that you get reciprocal access to the AI outputs they generate from your data. This reframes the power dynamic and protects your moat.

Resource-constrained startups are forgoing traditional hires like lawyers, instead using LLMs to analyze legal documents, identify unfavorable terms, and generate negotiation counter-arguments, saving significant legal fees in their first years.

Brex organizes its AI efforts into three pillars: buying tools for internal efficiency (Corporate), building/buying to reduce operational costs (Operational), and creating AI products that become part of their customers' own AI strategies (Product).

Brex formed a small, centralized AI team by asking, "What would a company founded today to disrupt Brex look like?" This team operates with the speed and focus of a startup, separate from the main engineering org to avoid corporate inertia.