The current affordability of AI tokens is not sustainable; it's propped up by venture capital funding AI companies operating at a loss. Businesses should treat this as a temporary window for aggressive learning and experimentation before prices inevitably rise to reflect true operational costs.

Related Insights

The massive capital expenditure by hyperscalers on AI will likely create an oversupply of capacity. This will crash prices, creating a golden opportunity for a new generation of companies to build innovative applications on cheap AI, much like Amazon built on the cheap bandwidth left behind by the dot-com bust.

Current AI pricing models, which pass on expensive LLM costs to users, are temporary. As LLM costs inevitably collapse and become commoditized, the winning companies will be those who have already evolved their monetization to be based on the value their product delivers.

The current subsidized AI subscription model is unsustainable. The inevitable shift to pay-per-token pricing will expose the true cost of inference. For tasks like coding, where AI can "hallucinate" and burn tokens in loops, this creates unpredictable and potentially exorbitant costs, akin to gambling.
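As a rough illustration of that unpredictability, with entirely hypothetical token prices and counts, pay-per-token costs scale directly with tokens consumed, so an agent stuck in a retry loop that keeps re-sending a growing context can multiply the bill far beyond a single clean call:

```python
# Hypothetical per-token prices (USD per 1M tokens); real prices vary by model.
PRICE_IN = 3.00 / 1_000_000    # input tokens
PRICE_OUT = 15.00 / 1_000_000  # output tokens

def query_cost(tokens_in: int, tokens_out: int) -> float:
    """Cost of a single model call under pay-per-token pricing."""
    return tokens_in * PRICE_IN + tokens_out * PRICE_OUT

# One clean coding request: ~2k tokens of context, ~1k tokens of output.
single = query_cost(2_000, 1_000)

# A hallucination loop: each of 10 retries re-sends the context, which grows
# by ~3k tokens per attempt, so input cost compounds across the loop.
looped = sum(query_cost(2_000 + i * 3_000, 1_000) for i in range(10))

print(f"single call: ${single:.4f}")   # $0.0210
print(f"10 retries:  ${looped:.4f}")   # $0.6150 — almost 30x the clean call
```

The point of the sketch is the ratio, not the absolute numbers: under usage-based pricing, failure modes that loop are billed in full.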

The perceived constraint on AI compute isn't a true supply issue, but a consequence of VC-funded companies pricing their services below cost to fuel growth. This creates artificial demand that masks the true, profitable market size until those companies are forced to adopt sustainable unit economics.

The narrative of "off the charts" AI demand is misleading. Major AI providers like OpenAI are "burning tens of billions of dollars," indicating they are not charging the true cost for their services. A realistic picture of demand will only emerge once they are forced to price for profitability, which could significantly cool the market.

The cost for a given level of AI capability has decreased by a factor of 100 in just one year. This radical deflation in the price of intelligence requires a complete rethinking of business models and future strategies, as intelligence becomes an abundant, cheap commodity.
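Taking the 100x annual figure at face value, a short calculation shows what that deflation implies on a monthly basis (the smoothing into a constant monthly rate is an assumption for illustration):

```python
# A 100x drop in capability-adjusted cost over one year, expressed as an
# equivalent constant monthly decline rate.
annual_factor = 1 / 100
monthly_factor = annual_factor ** (1 / 12)  # fraction of price retained each month
monthly_decline = 1 - monthly_factor

print(f"price retained per month:   {monthly_factor:.3f}")   # ~0.681
print(f"equivalent monthly decline: {monthly_decline:.1%}")  # ~31.9%
```

A price that loses roughly a third of its value every month is deflation on a scale few business models are built to absorb, which is why the insight calls for rethinking strategy rather than fine-tuning it.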

Contrary to the idea that technology always gets cheaper, building on AI may become more expensive. The current phase is characterized by abundant venture capital and intense competition among AI tool providers, which subsidizes costs for developers. As the market consolidates, these costs will rise.

The cost of AI, priced in "tokens by the drink," is falling dramatically. All inputs are on a downward cost curve, leading to a hyper-deflationary effect on the price of intelligence. This, in turn, fuels massive demand elasticity as more use cases become economically viable.

AI companies like OpenAI are losing money on their popular subscription plans. The computational cost (inference) to serve a user, especially a power user, often exceeds the subscription fee. This subsidized model is propped up by venture capital and is not sustainable long-term.
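A back-of-the-envelope sketch of that dynamic, using purely hypothetical figures (actual inference costs and plan margins are not public), shows how a flat fee can be profitable on light users while losing heavily on power users:

```python
# All numbers are illustrative assumptions, not reported figures.
SUBSCRIPTION_FEE = 20.00    # flat USD per month
COST_PER_1M_TOKENS = 10.00  # blended inference cost, USD

def monthly_margin(tokens_used: int) -> float:
    """Profit (or loss) on one subscriber for a month of inference."""
    inference_cost = tokens_used / 1_000_000 * COST_PER_1M_TOKENS
    return SUBSCRIPTION_FEE - inference_cost

light_user = monthly_margin(500_000)     # half a million tokens
power_user = monthly_margin(10_000_000)  # ten million tokens

print(f"light user margin: ${light_user:+.2f}")  # +$15.00
print(f"power user margin: ${power_user:+.2f}")  # -$80.00
```

Under flat pricing the heaviest users, who are often the most valuable advocates, are exactly the ones served at the largest loss.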

The primary short-term risk for the AI sector isn't capital expenditure but the high cost of token generation. For AI applications to become ubiquitous, the unit economics must improve. If running a single query remains prohibitively expensive for businesses, widespread, sustainable adoption will be impossible, threatening the entire investment thesis.