While AI chips represent the bulk of a data center's cost ($20-25M/MW), the remaining roughly $10 million per megawatt for essentials like powered land, construction, and capital goods is where the real bottlenecks lie. This 'picks and shovels' segment faces significant supply shortages and is viewed as a less speculative investment area that shows little sign of a bubble.

Related Insights

Current M&A activity related to AI isn't targeting AI model creators. Instead, capital is flowing into consolidating the 'picks and shovels' of the AI ecosystem. This includes derivative plays like data centers, semiconductors, software, and even power suppliers, which are seen as more tangible long-term assets.

When power (watts) is the primary constraint for data centers, the total cost of compute becomes secondary. The crucial metric is performance-per-watt. This gives a massive pricing advantage to the most efficient chipmakers, as customers will pay a steep premium for hardware that maximizes output from their fixed power budget.
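The logic above can be sketched with a toy calculation: under a fixed power budget, the chip with better performance-per-watt wins on total output even if it costs more per unit. All chip names and figures below are illustrative assumptions, not real specifications.

```python
# Hypothetical comparison: with power as the binding constraint, total
# output is (chips that fit in the budget) x (output per chip), so
# performance-per-watt decides the winner. Figures are made up.

POWER_BUDGET_W = 1_000_000  # 1 MW of usable IT power

chips = {
    # name: (tokens_per_second, watts_per_chip) -- illustrative only
    "chip_a": (900, 700),
    "chip_b": (1200, 750),
}

for name, (tps, watts) in chips.items():
    perf_per_watt = tps / watts
    n_chips = POWER_BUDGET_W // watts          # how many fit in the budget
    total_tps = n_chips * tps                  # total output at the cap
    print(f"{name}: {perf_per_watt:.2f} tok/s/W, "
          f"{n_chips} chips, {total_tps:,} tok/s total")
```

Here chip_b draws more power per unit but is more efficient per watt, so it delivers roughly 1.6M tokens/sec from the same megawatt versus chip_a's 1.29M, which is why a power-constrained buyer pays up for efficiency.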

During the dot-com crash, application-layer companies like Pets.com went to zero, while infrastructure providers like Intel and Cisco survived. The lesson for AI investors is to focus on the underlying "picks and shovels"—compute, chips, and data centers—rather than consumer-facing apps that may become obsolete.

Unlike the speculative "dark fiber" buildout of the dot-com bubble, today's AI infrastructure race is driven by real, immediate, and overwhelming demand. The problem isn't a lack of utilization for built capacity; it's a constant struggle to build supply fast enough to meet customer needs.

Bitcoin miners have inadvertently become a key part of the AI infrastructure boom. Their most valuable asset is not their hardware but their pre-existing, large-scale energy contracts. AI companies need this power, forcing partnerships that make miners a valuable pick-and-shovel play on AI.

Before AI delivers long-term deflationary productivity, it requires a massive, inflationary build-out of physical infrastructure. This makes sectors like utilities, pipelines, and energy infrastructure a timely hedge against inflation and a diversifier away from concentrated tech bets.

Instead of relying on hyped benchmarks, the truest measure of the AI industry's progress is the physical build-out of data centers. Tracking permits, power consumption, and satellite imagery reveals the concrete, multi-billion dollar bets being placed, offering a grounded view that challenges both extreme skeptics and believers.

While semiconductor access is a critical choke point, the long-term constraint on U.S. AI dominance is energy. Building massive data centers requires vast, stable power, but the U.S. faces supply chain issues for energy hardware and lacks a unified grid. China, in contrast, is strategically building out its energy infrastructure to support its AI ambitions.

Satya Nadella clarifies that the primary constraint on scaling AI compute is not the availability of GPUs, but the lack of power and physical data center infrastructure ("warm shells") to install them. This highlights a critical, often overlooked dependency in the AI race: energy and real estate development speed.

The infrastructure demands of AI have caused an exponential increase in data center scale. Two years ago, a 1-megawatt facility was considered a good size. Today, a large AI data center is a 1-gigawatt facility—a 1000-fold increase. This rapid escalation underscores the immense and expensive capital investment required to power AI.
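Combining this thousandfold jump in scale with the per-megawatt figures quoted earlier ($20-25M/MW for chips plus roughly $10M/MW for everything else) gives a sense of the capital involved. A back-of-envelope sketch, using only those quoted ranges:

```python
# Back-of-envelope capex for a 1-gigawatt AI data center, using the
# per-megawatt figures quoted in this piece. Purely illustrative
# arithmetic, not a project budget.

chip_cost_per_mw = (20e6 + 25e6) / 2   # midpoint of the $20-25M/MW range
other_cost_per_mw = 10e6               # powered land, construction, etc.

site_mw = 1_000                        # a 1-gigawatt facility

total = site_mw * (chip_cost_per_mw + other_cost_per_mw)
print(f"~${total / 1e9:.1f}B for a {site_mw} MW site")  # ~$32.5B
```

Even with generous error bars on every input, the result lands in the tens of billions of dollars per site, which is the point of the comparison: a 1 MW facility was a modest real estate project, while a 1 GW facility is a megaproject.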