Apple is letting rivals like Google spend billions building AI infrastructure. Apple's plan is then to license the winning large language models at relatively low cost and integrate them into its massive ecosystem of 2.5 billion devices, leveraging its distribution power without the immense capital expenditure.

Related Insights

Unlike competitors, which feel pressure to build proprietary AI foundation models, Apple can simply partner with providers like Google. This reveals that Apple's true moat isn't the model itself but its massive hardware distribution network, which gives it leverage to integrate best-in-class AI without the high cost of in-house development.

In search, Google pays Apple for default placement; in AI, the money currently flows the other way, with Apple paying Google for Gemini. The hosts predict this will reverse. As inference costs drop and monetization (via ads, affiliate fees, and transactions) improves, LLM queries will become profitable on average, making access to Apple's users a revenue stream worth paying for.

Apple is deliberately avoiding the massive, capital-intensive data center build-out pursued by its rivals. The company is betting that a more measured approach, relying on partners and on-device processing, will appear strategically brilliant as the market questions the sustainability of the AI infrastructure gold rush.

Apple's seemingly slow AI progress is likely a strategic bet that today's powerful cloud-based models will become efficient enough to run locally on devices within 12 months. This would allow it to offer powerful AI with superior privacy, potentially leapfrogging competitors.

Apple isn't trying to build the next frontier AI model. Instead, its strategy is to become the primary distribution channel by compressing competitors' state-of-the-art models and running them directly on devices. This play leverages its hardware ecosystem to offer superior privacy and performance.

Apple is avoiding massive capital expenditure on building its own LLMs. By partnering with a leader like Google for the underlying tech (e.g., Gemini for Siri), Apple can focus on its core strength: productizing and integrating technology into a superior user experience, which may be the more profitable long-term play.

In a major strategic move, Apple is white-labeling Google's Gemini model to power the upcoming, revamped Siri. Apple will pay Google for this underlying technology, a tacit admission that its in-house models are not yet competitive. The partnership aims to fix Siri's long-standing performance issues without Apple publicly advertising its reliance on a competitor.

Unlike search, where Apple charges Google roughly $20B a year for default access to its users, Apple is reportedly paying to use Google's Gemini AI. This reversal shows that elite AI technology currently holds more leverage than even Apple's massive user base.

While critics say Apple "missed AI," its strategy of partnering with Google for Gemini is a masterstroke. Apple avoids billions in CapEx, sidesteps brand-damaging AI controversies, and maintains control over the lucrative user interface, positioning itself to win the "agent of commerce" war.

By licensing Google's Gemini for Siri, Apple is strategically avoiding the capital-intensive foundation model war. This allows it to focus resources on its core strength: silicon and on-device AI. The long-term vision is a future where Apple dominates the "edge," interoperating with cloud AIs.