Users' entire personal lives—communications, files, locations—are stored in iMessage. This makes it a "system of record" that new platforms like AI assistants or smart glasses must integrate with to be useful, giving Apple a massive competitive advantage.
Unlike competitors that feel pressure to build proprietary AI foundation models, Apple can simply partner with providers like Google. This reveals that Apple's true moat isn't the model itself but its massive hardware distribution network, which gives it leverage to integrate best-in-class AI without the high cost of in-house development.
Apple isn't trying to build the next frontier AI model. Instead, its strategy is to become the primary distribution channel by compressing competitors' state-of-the-art models and running them directly on devices. This play leverages its hardware ecosystem to offer superior privacy and performance.
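To make "compressing" concrete, here is a minimal sketch of post-training int8 weight quantization, one common technique for shrinking a model enough to run on a phone or laptop. The matrix size and per-tensor scaling scheme are illustrative assumptions, not details of any actual Apple or Gemini deployment.

```python
# Toy post-training quantization: map float32 weights to int8 plus a scale factor.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Compress float32 weights to int8 with a single per-tensor scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original weights at inference time."""
    return q.astype(np.float32) * scale

w = np.random.randn(1024, 1024).astype(np.float32)   # one illustrative weight matrix
q, scale = quantize_int8(w)
print(f"fp32: {w.nbytes / 1e6:.1f} MB -> int8: {q.nbytes / 1e6:.1f} MB")
print("max reconstruction error:", np.abs(w - dequantize(q, scale)).max())
```

The 4x size reduction (and the small reconstruction error) is the basic trade-off that makes running large models on-device plausible at all.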
The killer feature for AI assistants isn't just answering abstract queries but deep integration with user data. Gemini's ability to analyze your unread emails, identify patterns, and suggest improvements provides immediate, tangible value, showcasing the advantage of AI embedded in existing productivity ecosystems.
As AI makes building software features trivial, the sustainable competitive advantage shifts to data. A true data moat uses proprietary customer interaction data to train AI models, creating a feedback loop that continuously improves the product faster than competitors.
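A minimal sketch of that feedback loop, under invented assumptions: interactions are logged with an implicit quality signal, and accepted ones become fine-tuning pairs for the next model version. `log_interaction` and `build_finetune_set` are hypothetical names, not any real product's API.

```python
# Hypothetical data-moat loop: proprietary interaction logs -> training examples.
import json, time
from pathlib import Path

LOG = Path("interactions.jsonl")

def log_interaction(prompt: str, response: str, accepted: bool) -> None:
    """Append one customer interaction, recording whether the user accepted it."""
    with LOG.open("a") as f:
        f.write(json.dumps({"ts": time.time(), "prompt": prompt,
                            "response": response, "accepted": accepted}) + "\n")

def build_finetune_set() -> list[dict]:
    """Keep only accepted interactions; these become input/target training pairs."""
    rows = [json.loads(line) for line in LOG.read_text().splitlines()]
    return [{"input": r["prompt"], "target": r["response"]}
            for r in rows if r["accepted"]]

log_interaction("summarize this thread", "Here's a short summary...", accepted=True)
print(build_finetune_set())
```

The point of the sketch is the loop itself: every use of the product generates data a competitor without the customer base cannot collect.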
The true evolution of communication isn't AI helping users draft messages but AI agents negotiating tasks like scheduling meetings directly with other agents, bypassing the manual back-and-forth in apps like iMessage.
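A toy sketch of what that agent-to-agent negotiation could look like: each assistant exposes its owner's availability and the agents converge on a slot without either human trading messages. The `Agent` class, slot format, and availability data are all invented for illustration.

```python
# Two hypothetical scheduling agents negotiating a meeting slot directly.
from dataclasses import dataclass

@dataclass
class Agent:
    owner: str
    free_slots: set[str]          # e.g. {"Tue 10:00", "Wed 14:00"}

    def propose(self) -> list[str]:
        """Offer the owner's open slots to the other agent."""
        return sorted(self.free_slots)

    def accept(self, proposals: list[str]) -> str | None:
        """Pick the first proposed slot that also works for this owner."""
        for slot in proposals:
            if slot in self.free_slots:
                return slot
        return None

alice = Agent("Alice", {"Tue 10:00", "Wed 14:00"})
bob = Agent("Bob", {"Wed 14:00", "Thu 09:00"})

slot = bob.accept(alice.propose())
print(f"Meeting booked for {slot}" if slot else "No overlap; escalate to humans")
```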
Creating a basic AI coding tool is easy. The defensible moat comes from building a vertically integrated platform with its own backend infrastructure like databases, user management, and integrations. This is extremely difficult for competitors to replicate, especially if they rely on third-party services like Supabase.
The future of AI isn't just in the cloud. Personal devices, like Apple's future Macs, will run sophisticated LLMs locally. This enables hyper-personalized, private AI that can index and interact with your local files, photos, and emails without sending sensitive data to third-party servers, fundamentally changing the user experience.
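A hedged sketch of that on-device pattern: index local files, retrieve the most relevant passages, and hand them to a locally running model so nothing leaves the machine. `run_local_model` is a stand-in for whatever on-device runtime is actually used (e.g. a quantized model behind llama.cpp or Core ML); it is not a real API, and the keyword-overlap retrieval is deliberately naive.

```python
# Toy local retrieval-augmented pipeline over a user's own files.
from pathlib import Path

def build_index(root: str) -> dict[Path, str]:
    """Read every .txt file under `root` into memory (toy index)."""
    return {p: p.read_text(errors="ignore") for p in Path(root).rglob("*.txt")}

def retrieve(index: dict[Path, str], query: str, k: int = 3) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(index.items(),
                    key=lambda kv: -len(terms & set(kv[1].lower().split())))
    return [text[:1000] for _, text in scored[:k]]

def run_local_model(prompt: str) -> str:
    # Placeholder for an on-device LLM runtime; returning a stub keeps the sketch
    # self-contained. In practice this would call the local inference engine.
    return f"[local model would answer here; prompt was {len(prompt)} chars]"

def ask(index: dict[Path, str], question: str) -> str:
    context = "\n---\n".join(retrieve(index, question))
    return run_local_model(f"Answer using only this local context:\n{context}\n\nQ: {question}")

print(ask(build_index("."), "What did I note about the Q3 budget?"))
```

The key property is that both the index and the model stay on the device; only the final answer ever reaches the user.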
OpenAI's platform strategy, which centralizes app distribution through ChatGPT, mirrors Apple's iOS model. This creates a 'walled garden' that could follow Cory Doctorow's 'enshittification' pattern: initially benefiting users, then locking them in, and finally exploiting them once they cannot easily leave the ecosystem.
By licensing Google's Gemini for Siri, Apple is strategically avoiding the capital-intensive foundation model war. This allows them to focus resources on their core strength: silicon and on-device AI. The long-term vision is a future where Apple dominates the "edge," interoperating with cloud AIs.
While personal history in an AI like ChatGPT seems to create lock-in, it is a weaker moat than the one media platforms like Google Photos enjoy. Text-based context and preferences are relatively easy to export and transfer to a competitor via another LLM, reducing switching friction.
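A small sketch of why text-based memory transfers so easily: preferences and facts exported from one assistant can simply be replayed to another as plain context. The export format and the `memory` contents here are made up for illustration.

```python
# Exporting assistant "memory" as plain text and re-importing it elsewhere.
import json

memory = {
    "preferences": ["concise answers", "metric units", "no emojis"],
    "facts": ["works at a biotech startup", "commutes by train"],
}

# Step 1: export from assistant A as a portable blob of text.
export_blob = json.dumps(memory, indent=2)

# Step 2: re-import into assistant B as a system prompt -- no proprietary format needed.
system_prompt = (
    "You are my new assistant. Adopt this context from my previous assistant:\n"
    + export_blob
)
print(system_prompt)
```

Photos and videos, by contrast, are bulky and often entangled with platform-specific features, which is why the media lock-in is stronger.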