The rise of physical AI is supported by a parallel revolution in low-power microelectronics. These cheap, efficient chips allow entrepreneurs to build and deploy smaller, specialized models on inexpensive hardware, bypassing the need for massive cloud resources and opening up a wave of new opportunities.
The specific AI model used is becoming as irrelevant as the specific variety of corn in a gourmet dish. The true value and differentiation lie not in the commodity model itself, but in the entire system—the agentic harnesses, workflows, and user experience—that prepares and presents the final product.
Drawing a parallel to the microservices boom, enterprises will soon deploy thousands of AI agents, creating immense operational complexity. The most valuable future products will be those that, like Datadog for microservices, provide governance, monitoring, and orchestration for this sprawling agentic workforce.
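To make the "Datadog for agents" idea concrete, here is a minimal, purely illustrative sketch of an observability layer for an agent fleet. All names here (`AgentEvent`, `TelemetryHub`) are invented for illustration; a real product would add tracing, policy enforcement, and orchestration on top of this kind of telemetry core.

```python
# Hypothetical sketch of fleet-level agent telemetry: each agent action
# is recorded as a structured event, and the hub aggregates per-agent
# health metrics (here, just an error rate).
from dataclasses import dataclass, field


@dataclass
class AgentEvent:
    agent_id: str      # which agent in the fleet acted
    action: str        # e.g. "extract_fields", "post_to_erp"
    duration_ms: float # latency of the action
    ok: bool           # did the action succeed?


@dataclass
class TelemetryHub:
    events: list = field(default_factory=list)

    def record(self, event: AgentEvent) -> None:
        self.events.append(event)

    def error_rate(self, agent_id: str) -> float:
        """Fraction of this agent's recorded actions that failed."""
        relevant = [e for e in self.events if e.agent_id == agent_id]
        if not relevant:
            return 0.0
        return sum(1 for e in relevant if not e.ok) / len(relevant)


hub = TelemetryHub()
hub.record(AgentEvent("invoice-bot", "extract_fields", 120.5, True))
hub.record(AgentEvent("invoice-bot", "post_to_erp", 300.2, False))
print(hub.error_rate("invoice-bot"))  # 0.5
```

The point of the sketch is that once thousands of agents emit events like these, the aggregation, alerting, and governance layer becomes the durable product, much as it did for microservices.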
Building a business entirely on a closed-source API from a major provider like Anthropic or OpenAI is precarious. These platform companies can and do release new capabilities that directly compete with and subsume the functionality of startups in their ecosystem, effectively erasing those businesses overnight.
The focus on benchmark scores for frontier models is misplaced for most practical use cases. Many applications, especially in physical and embedded AI, rely on smaller, specialized models. The small percentage point differences on abstract benchmarks have little bearing on solving a specific business problem effectively.
Meta, a long-time champion of Western open-source models with its Llama family, has signaled a pivot toward a closed-source strategy. This creates a vacuum, elevating Chinese open-source models to a dominant position and raising potential national security concerns for Western countries wary of their adoption.
