Extreme conviction in prediction markets may not be just speculation. It could signal bets being placed by insiders with proprietary knowledge, such as developers working on AI models or administrators of the leaderboards themselves. This makes these markets a potential source of leaked alpha on who is truly ahead.
Anthropic is making its models available on AWS, Azure, and Google Cloud. This multi-cloud approach is a deliberate business strategy: by positioning itself as a neutral infrastructure provider rather than a rival application builder, Anthropic signals to customers that it aims to be a partner, not a competitor.
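A rough illustration of what that neutrality looks like from a developer's seat, assuming the `anthropic` Python SDK's cloud-specific clients (the region, project, and model IDs below are placeholders, and Azure access goes through Microsoft's own tooling rather than this SDK): the same request shape works whether the model is served from AWS Bedrock or Google Cloud Vertex AI.

```python
# Sketch: calling the same family of Claude models through two different clouds
# via the `anthropic` SDK's cloud-specific clients. IDs and regions are illustrative.
from anthropic import AnthropicBedrock, AnthropicVertex

prompt = [{"role": "user", "content": "Summarize our incident report."}]

# Via AWS Bedrock
bedrock = AnthropicBedrock(aws_region="us-east-1")
reply = bedrock.messages.create(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder Bedrock model ID
    max_tokens=256,
    messages=prompt,
)
print(reply.content[0].text)

# Via Google Cloud Vertex AI -- same request shape, different cloud
vertex = AnthropicVertex(project_id="my-gcp-project", region="us-east5")
reply = vertex.messages.create(
    model="claude-3-5-sonnet@20240620",  # placeholder Vertex model ID
    max_tokens=256,
    messages=prompt,
)
print(reply.content[0].text)
```

Because the model sits behind whichever cloud the customer already trusts, switching providers is a client swap rather than an application rewrite.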
When major infrastructure like AWS or Cloudflare goes down, it affects many companies simultaneously. This creates a collective "mulligan," meaning individual startups aren't heavily penalized by users for the downtime, as the issue is widespread. The exception is for mission-critical services like finance or live events.
Companies are becoming wary of feeding their proprietary data and customer queries into third-party LLMs like ChatGPT, fearing that doing so trains a potential future competitor. The likely shift is toward running private, open-source models on their own cloud instances to preserve a competitive moat and keep sensitive data in-house.
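A minimal sketch of what that looks like in practice, assuming the Hugging Face `transformers` library and an open-weight model whose weights have been pulled onto a company-controlled instance (the model name and prompt are illustrative, not recommendations):

```python
# Sketch: querying a self-hosted open-weight model so prompts and proprietary
# data never leave infrastructure the company controls.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # any open-weight model downloaded to your own instance
    device_map="auto",                           # use local GPU(s) if available
)

# Customer queries are answered entirely on hardware you control;
# nothing is sent to a third-party LLM provider.
prompt = "Summarize this quarter's churn drivers from our internal notes: ..."
result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```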
The proliferation of AI leaderboards incentivizes companies to optimize models for specific benchmarks. This creates a risk of "acing the SATs": models excel on the tests without necessarily making progress on real-world problems. Optimizing for these metrics can diverge from creating genuine user value.
The immediate threat from AI is to entry-level white-collar jobs, not senior roles. Senior staff can now use AI for the research and drafting "grunt work" previously assigned to apprentices. That removes the bottom rung of the traditional career ladder, making it harder for new talent to enter professions like law, finance, and consulting.
The future of AI isn't just in the cloud. Personal devices, like Apple's future Macs, will run sophisticated LLMs locally. This enables hyper-personalized, private AI that can index and interact with your local files, photos, and emails without sending sensitive data to third-party servers, fundamentally changing the user experience.
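One piece of that experience, local semantic search over personal files, already works on today's hardware. A minimal sketch, assuming the `sentence-transformers` package and an illustrative folder path; the embedding model runs entirely on the local machine, so file contents are never uploaded:

```python
# Sketch: private, on-device semantic search over local documents.
from pathlib import Path

from sentence_transformers import SentenceTransformer, util

# Small embedding model that fits comfortably on a laptop; runs locally.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Index: embed every text file in a local folder (path is illustrative).
files = list(Path("~/Documents").expanduser().glob("*.txt"))
texts = [p.read_text(errors="ignore") for p in files]
embeddings = model.encode(texts, convert_to_tensor=True)

# Query: find the most relevant local documents for a natural-language question,
# without any of the underlying files leaving the machine.
query = model.encode(["notes from the accountant about last year's deductions"], convert_to_tensor=True)
for hit in util.semantic_search(query, embeddings, top_k=3)[0]:
    print(files[hit["corpus_id"]], round(hit["score"], 3))
```

A device-wide assistant would layer a local LLM on top of an index like this, answering questions about your own files, photos, and emails without a round trip to anyone's servers.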
