AI's justification for massive energy and capital consumption is weakening as its public-facing applications pivot from world-changing goals to trivial uses like planning vacations or creating anime-style images. This makes the high societal costs of data centers and electricity usage harder for the public to accept.

Related Insights

The massive investment in AI isn't justified by displacing illustrators, whose total wages are negligible. The economic model is predicated on replacing high-cost professions like radiologists or software engineers, which is a far more challenging task.

Despite staggering announcements for new AI data centers, a primary limiting factor will be the availability of electrical power. The current growth curve of the power infrastructure cannot support all the announced plans, creating a physical bottleneck that will likely lead to project failures and investment "carnage."

For years, the tech industry criticized Bitcoin's energy use. Now, the massive energy needs of AI training have forced Silicon Valley to prioritize energy abundance over purely "green" initiatives. Companies like Meta are building huge natural gas-powered data centers, a major ideological shift.

Instead of relying on hyped benchmarks, the truest measure of the AI industry's progress is the physical build-out of data centers. Tracking permits, power consumption, and satellite imagery reveals the concrete, multi-billion dollar bets being placed, offering a grounded view that challenges both extreme skeptics and believers.

Products like Sora and current LLMs are not yet sustainable businesses. They function as temporary narratives, or "shims," to attract immense capital for building compute infrastructure. This high-risk game bets on a religious belief in a future breakthrough, not on the viability of current products.

Despite the hype, AI's impact on daily life remains minimal because most consumer apps haven't changed. The true societal shift will occur when new, AI-native applications are built from the ground up, much like the iPhone enabled a new class of apps, rather than just bolting AI features onto old frameworks.

Unlike other tech rollouts, the AI industry's public narrative has been dominated by vague warnings of disruption rather than clear, tangible benefits for the average person. This communication failure is a key driver of widespread anxiety and opposition.

The infrastructure demands of AI have caused an exponential increase in data center scale. Two years ago, a 1-megawatt facility was considered a good size. Today, a large AI data center is a 1-gigawatt facility, a 1,000-fold increase. This rapid escalation underscores the immense and expensive capital investment required to power AI.

Microsoft's plan to train 20 million people in India to use AI actively fuels exponential demand for energy-intensive computing. This creates a fundamental long-term conflict with its commitment to build fully sustainable data centers. The strategy's success hinges on whether efficiency gains can outpace this deliberately engineered demand growth.

The biggest risk to the massive AI compute buildout isn't that scaling laws will break, but that consumers will be satisfied with a "115 IQ" AI running for free on their devices. If edge AI is sufficient for most tasks, it undermines the economic model for ever-larger, centralized "God models" in the cloud.

AI's Pitch Shifted from Curing Cancer to Mundane Consumer Apps | RiffOn