Revenue figures for AI companies can be misleading. The same dollar is often counted multiple times as it moves from the end customer through a SaaS provider and a cloud platform before reaching the model provider, creating a "margin stacking" effect that obscures how much net revenue the ecosystem is actually generating.
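
As a minimal sketch with invented figures, the double counting looks like this: each layer books the same customer dollar as its own revenue, so summing reported revenue across the chain overstates the new money that actually entered the system.

```python
# Hypothetical illustration of how one end-customer dollar is booked as revenue
# at several layers of the AI stack. All figures are invented.

customer_spend = 1.00              # end customer pays a SaaS provider $1.00
saas_revenue = customer_spend      # SaaS provider books the full $1.00
cloud_bill = 0.60                  # SaaS provider spends $0.60 on a cloud platform
cloud_revenue = cloud_bill         # cloud platform books $0.60
model_bill = 0.40                  # cloud platform passes $0.40 to the model provider
model_revenue = model_bill         # model provider books $0.40

sum_of_reported_revenue = saas_revenue + cloud_revenue + model_revenue
net_new_dollars = customer_spend   # only $1.00 of outside money entered the chain

print(f"Sum of reported revenue across layers: ${sum_of_reported_revenue:.2f}")  # $2.00
print(f"Net new dollars from the end customer: ${net_new_dollars:.2f}")          # $1.00
```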

Related Insights

For a true AI-native product, extremely high margins might indicate it isn't using enough AI, as inference has real costs. Founders should price for adoption on the expectation that model costs will fall, and plan to build strong margins later through more sophisticated, usage-based pricing tiers rather than optimizing margins prematurely.

The compute-heavy nature of AI makes traditional 80%+ SaaS gross margins impossible. Companies should embrace lower margins as proof of user adoption and value delivery. This strategy mirrors the successful on-premise to cloud transition, which ultimately drove massive growth for companies like Microsoft.

Counterintuitively, very high gross margins in a company pitching itself as "AI" can be a warning sign. It may indicate that users aren't engaging with the core, computationally expensive AI features. Lower margins can signal genuine, heavy usage of the core AI product.

Unlike SaaS, where high gross margins are key, an AI company with very high margins likely isn't seeing significant use of its core AI features. Low margins signal that customers are actively using compute-intensive products, a positive early indicator.

Contrary to traditional software evaluation, Andreessen Horowitz now questions AI companies that present high, SaaS-like gross margins. This often indicates a critical flaw: customers are not engaging with the costly, core AI features. Low margins, in this context, can be a positive signal of genuine product usage and value delivery.

Traditional SaaS metrics like 80%+ gross margins are misleading for AI companies. High inference costs lower margins, but if the absolute gross profit per customer is multiples higher than a SaaS equivalent, it's a superior business. The focus should shift from margin percentages to absolute gross profit dollars and multiples.
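
A hypothetical comparison makes the point: a lower-margin AI product can still throw off several times the gross profit per customer of a classic SaaS product, so absolute dollars matter more than the percentage. The contract sizes and margins below are illustrative only.

```python
# Hypothetical comparison of margin percentage vs. absolute gross profit per
# customer. All figures are illustrative.

saas_acv, saas_margin = 10_000, 0.80   # classic SaaS: $10K/yr contract at 80% gross margin
ai_acv, ai_margin = 60_000, 0.45       # AI-native product: $60K/yr contract at 45% gross margin

saas_gross_profit = saas_acv * saas_margin   # $8,000 of gross profit per customer
ai_gross_profit = ai_acv * ai_margin         # $27,000 of gross profit per customer

print(f"SaaS: {saas_margin:.0%} margin, ${saas_gross_profit:,.0f} gross profit per customer")
print(f"AI:   {ai_margin:.0%} margin, ${ai_gross_profit:,.0f} gross profit per customer")
print(f"AI gross profit is {ai_gross_profit / saas_gross_profit:.1f}x the SaaS figure")
```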

Large tech firms invest in AI startups that then agree to spend that money on the investor's services. This creates a "circular" flow of cash that boosts the startup's perceived revenue and the tech giant's AI-related sales, raising questions about how much of that revenue is real.
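
A simplified, purely illustrative version of the flow: part of the investor's own capital comes back to it as cloud revenue, so some of the reported AI-related sales are funded by the investment itself. The amounts below are invented.

```python
# Simplified, illustrative sketch of the "circular" cash flow. Amounts are invented.

investment = 100_000_000             # tech giant invests $100M in an AI startup
committed_cloud_spend = 80_000_000   # startup commits $80M of it back to the investor's cloud

investor_ai_revenue = committed_cloud_spend                    # investor reports $80M of AI-related sales
startup_remaining_cash = investment - committed_cloud_spend    # startup keeps $20M of usable cash

print(f"Investor books ${investor_ai_revenue / 1e6:.0f}M of revenue funded by its own investment")
print(f"Startup retains ${startup_remaining_cash / 1e6:.0f}M of the original ${investment / 1e6:.0f}M")
```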

The shift to usage-based pricing for AI tools isn't just a revenue growth strategy. Enterprise vendors are adopting it to offset their own escalating cloud infrastructure costs, which scale directly with customer usage, thereby insulating their profit margins from their own suppliers' growing bills.
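
A rough sketch with invented numbers shows the mechanics: under flat seat pricing the vendor's margin erodes as customer usage (and the vendor's own cloud bill) grows, while usage-based pricing keeps the margin roughly constant.

```python
# Rough sketch of flat seat pricing vs. usage-based pricing when the vendor's
# own cloud cost scales with customer usage. All numbers are invented.

cost_per_unit = 0.02          # vendor's cloud cost per unit of customer usage
seat_price = 50.0             # flat monthly price per seat
usage_price_per_unit = 0.05   # usage-based price per unit

for monthly_usage in (500, 2_000, 5_000):
    cost = monthly_usage * cost_per_unit
    flat_margin = (seat_price - cost) / seat_price
    usage_revenue = monthly_usage * usage_price_per_unit
    usage_margin = (usage_revenue - cost) / usage_revenue
    print(f"usage={monthly_usage:>5}: flat-price margin={flat_margin:7.0%}, "
          f"usage-based margin={usage_margin:6.0%}")
```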

The AI infrastructure boom is a potential house of cards. A single dollar of end-user revenue paid to a company like OpenAI can become $8 of "seeming revenue" as it cascades through the value chain to Microsoft, CoreWeave, and NVIDIA, supporting an unsustainable $100 of equity market value.
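
Taking the claim's own figures at face value, the implied arithmetic looks roughly like this; the 8x cascade and the resulting valuation multiple come from the claim itself, not from any company's filings.

```python
# Arithmetic implied by the "$1 -> $8 -> $100" claim above. The cascade factor
# and valuation figures are the claim's own numbers, not disclosed financials.

end_user_revenue = 1.00     # $1 paid by an end user to a model provider
seeming_revenue = 8.00      # total revenue reported across the value chain
equity_value = 100.00       # equity market value said to rest on that dollar

cascade_multiplier = seeming_revenue / end_user_revenue    # 8x revenue counting across layers
implied_revenue_multiple = equity_value / seeming_revenue  # 12.5x multiple on aggregate revenue

print(f"${end_user_revenue:.0f} of real spend -> {cascade_multiplier:.0f}x reported revenue "
      f"-> {implied_revenue_multiple:.1f}x multiple on ${equity_value:.0f} of equity value")
```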

Large-sounding enterprise AI adoption metrics, like Google's '150 enterprises processing a trillion tokens,' can translate to surprisingly low revenue—less than $1M per enterprise annually. This suggests headline adoption numbers may not yet reflect significant financial impact for cloud providers.
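
A back-of-envelope check shows why such headline numbers can imply modest revenue. This assumes the trillion tokens is per enterprise and a blended price of roughly $0.50 per million tokens; both are assumptions, since actual pricing varies widely by model and usage mix.

```python
# Back-of-envelope token revenue. The per-token price is an assumption; actual
# blended pricing varies widely by model and usage mix.

tokens_processed = 1_000_000_000_000   # one trillion tokens (assumed per enterprise, per year)
assumed_price_per_million = 0.50       # hypothetical blended $ per million tokens

implied_revenue = tokens_processed / 1_000_000 * assumed_price_per_million
print(f"Implied annual revenue per enterprise: ${implied_revenue:,.0f}")  # $500,000 -- under $1M
```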