
Greg Brockman simplifies OpenAI's business to its most fundamental level: buying or building massive amounts of compute and reselling it with an intelligence layer on top. In this framing, access to computation is both the primary growth vector and the binding constraint, making the core operation a margin-based resale of processing power.

Related Insights

Firms like OpenAI and Meta claim a compute shortage while also exploring selling compute capacity. This isn't a contradiction but a strategic evolution. They are buying all available supply to secure their own needs and then arbitraging the excess, effectively becoming smaller-scale cloud providers for AI.

Anthropic's capital efficiency in model training has been impressive. However, OpenAI's willingness to spend massively on compute could become a decisive advantage. As user demand outstrips supply, reliable service capacity—not just model quality—may become the key differentiator and competitive moat.

Unlike traditional software, OpenAI's growth is limited by a zero-sum resource: GPUs. This physical constraint creates a constant, painful trade-off between serving existing users, launching new features, and funding research, making GPU allocation a central strategic challenge.

Sam Altman's vision for OpenAI's business is not complex software licensing but selling intelligence as a fundamental utility. The model is to "sell tokens" on a metered basis, much like a power company sells electricity, aiming to make intelligence abundant and accessible on demand.
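The utility analogy above can be made concrete with a toy metered-billing calculation. This is an illustrative sketch, not OpenAI's actual pricing: the rate constant and usage figures are assumptions, chosen only to show how "selling tokens" parallels a power company metering kilowatt-hours.

```python
# Hypothetical metered "sell tokens" billing, analogous to a power
# company's per-kWh meter. The rate below is an assumption for
# illustration, not an actual OpenAI price.
PRICE_PER_1K_TOKENS = 0.002  # assumed flat rate, in dollars

def token_bill(tokens_used: int) -> float:
    """Charge for usage the way a utility meters electricity consumption."""
    return tokens_used / 1000 * PRICE_PER_1K_TOKENS

# A billing cycle is just metered consumption times the rate.
monthly_usage = 5_000_000  # tokens consumed this cycle (assumed)
print(f"${token_bill(monthly_usage):.2f}")  # -> $10.00
```

The point of the sketch is that the product surface collapses to a single metered quantity, which is what makes the electricity comparison apt.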

Instead of managing compute as a scarce resource, Sam Altman's primary focus has become expanding the total supply. His goal is to create compute abundance, moving from a mindset of internal trade-offs to one where the main challenge is finding new ways to use more power.

Sam Altman clarifies that OpenAI's large losses are a strategic investment in training. The core economic model assumes that revenue growth directly follows the expansion of their compute fleet, stating that if they had double the compute, they would have double the revenue today.

OpenAI's CFO argues that revenue growth has a nearly 1-to-1 correlation with compute expansion. This narrative frames fundraising not as covering losses but as unlocking demand currently capped by supply, positioning capital injection as a direct path to predictable revenue growth for investors.

Instead of viewing compute as a cost center, OpenAI treats it as a revenue generator, analogous to hiring salespeople. The core belief is that demand for AI capabilities is so vast that they can never build compute fast enough to satisfy it, justifying massive, forward-looking infrastructure investments.

Greg Brockman demystifies OpenAI's business model as a straightforward process: acquire compute power through renting, building, or buying, and then resell that compute in the form of intelligence at a positive operating margin. Success depends on scalable demand for intelligence, which he views as unlimited.
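Brockman's acquire-and-resell framing reduces to simple unit economics: what a GPU-hour costs in, versus what the tokens it serves sell for. The sketch below uses entirely hypothetical numbers to show the shape of the calculation, not real costs or prices.

```python
# Unit-economics sketch of "buy compute, resell it as intelligence at a
# positive operating margin". All inputs are hypothetical assumptions.

def operating_margin(cost_per_gpu_hour: float,
                     tokens_per_gpu_hour: float,
                     price_per_million_tokens: float) -> float:
    """Fraction of token revenue left after paying for the underlying compute."""
    revenue = tokens_per_gpu_hour / 1e6 * price_per_million_tokens
    return (revenue - cost_per_gpu_hour) / revenue

# Assume: rent a GPU-hour for $2, serve 10M tokens with it,
# and sell output at $1 per 1M tokens -> $10/hr revenue, $2/hr cost.
m = operating_margin(2.0, 10_000_000, 1.0)
print(f"{m:.0%}")  # -> 80%
```

With margin positive, the only remaining variable is scalable demand for tokens, which is exactly the quantity Brockman describes as unlimited.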

Sam Altman claims OpenAI is so "compute constrained that it hits the revenue lines so hard." This reframes compute from a simple R&D or operational cost into the primary factor limiting growth across consumer and enterprise segments. The theory posits a direct correlation between available compute and revenue, justifying enormous spending on infrastructure.

OpenAI's Core Business Model is Reselling Compute at a Margin | RiffOn