
Instead of building its own costly large language model, Apple could leverage its powerful distribution by auctioning off the default AI assistant role on its devices. This would mirror its lucrative deal with Google for search, creating a massive new revenue stream without the R&D risk.

Related Insights

Unlike competitors feeling pressure to build proprietary AI foundation models, Apple can simply partner with providers like Google. This reveals Apple's true moat isn't the model itself but its massive hardware distribution network, giving it leverage to integrate best-in-class AI without the high cost of in-house development.

While other tech giants' capital expenditures skyrocket to fund AI development, Apple's have declined. The company strategically sidesteps the costly race to build foundation models by partnering with Google. It will integrate Gemini into its products, letting Google bear the immense infrastructure and training costs.

Currently, Apple pays Google for search defaults. The hosts predict this will reverse for AI. As inference costs drop and monetization (via ads, affiliate fees, transactions) improves, LLM queries will become profitable on average, making access to Apple's users a revenue stream worth paying for.

While other tech giants are massively increasing capital expenditures to build AI data centers, Apple's CapEx is down. This reveals a deliberate strategy to avoid the high costs of training foundation models by integrating third-party AI, like Google's Gemini, into its products.

Apple is letting rivals like Google spend billions on building AI infrastructure. Apple's plan is to then license the winning large language models at low cost and integrate them into its massive ecosystem of 2.5 billion devices, leveraging its distribution power without the immense capital expenditure.

Apple's dominant hardware and App Store ecosystem allow it to generate over $1B in annual revenue from AI app fees. This strategy outsources the massive CapEx and R&D risk to AI labs like OpenAI, creating a high-margin business while Apple refines its own on-device AI plans.

Apple is avoiding massive capital expenditure on building its own LLMs. By partnering with a leader like Google for the underlying tech (e.g., Gemini for Siri), Apple can focus on its core strength: productizing and integrating technology into a superior user experience, which may be the more profitable long-term play.

Instead of an exclusive AI partner, Apple could offer a choice of AI agents (OpenAI, Anthropic, etc.) on setup, similar to the EU's browser choice screen. This would create a competitive marketplace for AI assistants on billions of devices, driving significant investment and innovation across the industry.

By allowing third-party AI assistants to integrate with Siri, Apple isn't just conceding its AI lag. This strategy aims to capture a share of AI subscription revenue through the App Store and preemptively address antitrust concerns, mirroring its approach with search engines in Safari.

Apple is successfully navigating the AI race by avoiding the massive expense of building foundational models. Instead, it's partnering with companies like Google for AI capabilities while focusing on its core strength: selling high-margin hardware. This allows Apple to capture the end-user without the costly infrastructure build-out of its rivals.

Apple May Treat Generative AI Like Search and Auction Off the Default Position | RiffOn