To put energy use in perspective, the Bitcoin network is estimated to consume between 14 and 19 gigawatts of average power. This dwarfs the capacity of major AI labs like OpenAI and Anthropic, estimated at around 2 gigawatts, highlighting the immense energy scale of proof-of-work systems.
The standard for measuring large compute deals has shifted from number of GPUs to gigawatts of power. This provides a normalized, apples-to-apples comparison across different chip generations and manufacturers, acknowledging that energy is the primary bottleneck for building AI data centers.
Bitcoin's "proof of work" is criticized for its massive, non-productive energy use. A novel concept is to use AI inference compute as the work itself. This "productive proof of work" would secure a cryptocurrency network while simultaneously generating valuable AI-driven outputs, aligning energy consumption with useful computation.
The massive energy consumption of AI data centers is driving the first sustained surge in electricity demand in 70 years, comparable to the one caused by the widespread adoption of air conditioning. This is forcing tech giants to adopt a "Bring Your Own Power" (BYOP) strategy, essentially turning them into energy producers.
Bitcoin miners have inadvertently become a key part of the AI infrastructure boom. Their most valuable asset is not their hardware but their pre-existing, large-scale energy contracts. AI companies need this power, forcing partnerships that make miners a valuable pick-and-shovel play on AI.
AI's energy-intensive nature creates a new, powerful stakeholder demanding cheap power. This diverts negative attention from Bitcoin's energy use and aligns incentives for building robust energy grids that ultimately benefit Bitcoin miners as well.
The limiting factor for large-scale AI compute is no longer physical space but the availability of electrical power. As a result, the industry now sizes data centers and negotiates deals in megawatts, reflecting the primary constraint on growth.
The energy demands of modern AI are difficult to contextualize. A one-gigawatt data center uses as much power as a city of nearly one million US households. A five-gigawatt facility requires a 5,000-acre building footprint, excluding any power infrastructure.
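The household comparison can be sanity-checked with back-of-the-envelope arithmetic. This sketch assumes (not from the source) that an average US household consumes about 10,700 kWh per year, i.e. a continuous draw of roughly 1.2 kW:

```python
# Rough check of the "1 GW ~ nearly one million US households" claim.
# Assumption: average US household uses ~10,700 kWh/year (~1.2 kW continuous).
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_700
HOURS_PER_YEAR = 8_760

avg_household_kw = AVG_HOUSEHOLD_KWH_PER_YEAR / HOURS_PER_YEAR  # ~1.22 kW

def households_per_gigawatt(gw: float) -> int:
    """How many average US households a load of `gw` gigawatts could supply."""
    return int(gw * 1_000_000 / avg_household_kw)  # 1 GW = 1,000,000 kW

print(households_per_gigawatt(1))  # on the order of 800,000 households
print(households_per_gigawatt(5))  # a 5 GW campus: roughly 4 million households
```

Under that assumption, 1 GW serves a bit over 800,000 households, consistent with the "nearly one million" framing in the insight.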
A key real-time indicator of crypto's viability is the behavior of its miners. Many are pivoting to provide power for AI infrastructure, signaling that the economic incentives in centralized AI are currently stronger. This represents a direct power struggle between the two ecosystems.
The infrastructure demands of AI have caused an exponential increase in data center scale. Two years ago, a 1-megawatt facility was considered a good size. Today, a large AI data center is a 1-gigawatt facility, a 1,000-fold increase. This rapid escalation underscores the immense capital investment required to power AI.
OpenAI's partnership with NVIDIA for 10 gigawatts is just the start. Sam Altman's internal goal is 250 gigawatts by 2033, a staggering $12.5 trillion investment. This reflects a future where AI is a pervasive, energy-intensive utility powering autonomous agents globally.
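The figures above imply a per-gigawatt price tag, which this short arithmetic sketch makes explicit (the 250 GW and $12.5 trillion numbers come from the source; the per-gigawatt breakdown is derived):

```python
# Back-of-the-envelope: what does $12.5T for 250 GW imply per gigawatt?
TARGET_GW = 250
TOTAL_COST_USD = 12.5e12  # $12.5 trillion

cost_per_gw = TOTAL_COST_USD / TARGET_GW  # implied build cost per gigawatt
print(f"${cost_per_gw / 1e9:.0f}B per gigawatt")  # $50B per gigawatt
```

At roughly $50 billion per gigawatt of capacity, even the existing 10-gigawatt NVIDIA deal represents on the order of half a trillion dollars of implied investment.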