We scan new podcasts and send you the top 5 insights daily.
Despite the rapid pace of hardware innovation, the value of older NVIDIA GPUs like the H100 is holding strong. Cloud provider CoreWeave reports these chips are retaining 90-95% of their pricing power over a 5-6 year lifespan because compute demand far outstrips supply.
CoreWeave dismisses speculative analyst reports on GPU depreciation. Their metric for an asset's true value is the willingness of sophisticated buyers (hyperscalers, AI labs) to sign multi-year contracts for it. This real-world commitment is a more reliable indicator of long-term economic utility than any external model.
Unlike typical computer hardware that depreciates rapidly, H100 GPUs are trading above their launch price in secondary markets. This market anomaly, driven by the extreme and sustained compute shortage for AI, completely inverts traditional financial models for hardware assets.
The current AI moment is unique because demand outstrips supply so dramatically that even previous-generation chips and models remain valuable. They are perfectly suited for running smaller models for simpler, high-volume applications like voice transcription, creating a broad-based boom across the entire hardware and model stack.
According to CoreWeave's CEO, a GPU becomes obsolete not when a new chip is released, but when the power and space it consumes could earn more running a higher-margin, newer chip. The decision is purely economic, driven by the opportunity cost of the power and rack space the chip occupies, not by its technical viability.
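That opportunity-cost rule can be sketched as a simple comparison. This is a hypothetical illustration of the reasoning described above, not CoreWeave's actual model; the function name and all dollar figures are invented for the example.

```python
# Hypothetical sketch of the retirement logic described above: a GPU is
# retired not on age, but when the rack power/space it occupies would earn
# more hosting a newer chip. All figures are illustrative, not real data.

def should_retire(old_margin_per_kw: float, new_margin_per_kw: float,
                  swap_cost_per_kw: float, horizon_years: float) -> bool:
    """Retire the old chip if the newer chip's extra margin over the
    planning horizon exceeds the one-time cost of swapping hardware."""
    extra_margin = (new_margin_per_kw - old_margin_per_kw) * horizon_years
    return extra_margin > swap_cost_per_kw

# An old chip earning $2,000 per kW-year keeps its slot when a newer chip
# earning $2,300 per kW-year can't cover a $1,500/kW swap cost in 3 years...
print(should_retire(2000, 2300, 1500, 3))   # (2300-2000)*3 = 900 < 1500 -> False
# ...but a much higher-margin chip tips the decision.
print(should_retire(2000, 3000, 1500, 3))   # (3000-2000)*3 = 3000 > 1500 -> True
```

The point of the sketch: as long as demand keeps the old chip's margin high, no new release by itself triggers retirement.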
The sustainability of the AI infrastructure boom is debated. One view holds that GPUs lose most of their value within five years, making current spending speculative. The counterargument is that older chips will have a long, productive life serving less complex models, much as mainframes did, making them a more durable capital investment.
Unlike typical hardware, which loses value with age, GPUs like NVIDIA's H100 can become more valuable over time. Newer, more efficient AI models extract significantly more output and value from the same silicon, tying a GPU's worth to its utility rather than its age.
Contrary to the belief that AI chips quickly become obsolete, CoreWeave's CEO argues their value holds, citing average five-year client contracts as evidence. Older chips like the A100 have even appreciated in price as new use cases emerge, making rapid depreciation a myth.
Hyperscalers are extending depreciation schedules for AI hardware. While this may look like "cooking the books" to inflate earnings, it's justified by the reality that even 7-8 year old TPUs and GPUs are still running at 100% utilization for less complex AI tasks, making them valuable for longer and validating the accounting change.
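The earnings effect of a longer schedule is plain straight-line arithmetic: the same capital cost spread over more years means a smaller annual expense. A minimal sketch with purely illustrative figures (the $10B fleet cost and the 4-to-6-year change are assumptions for the example, not reported numbers):

```python
# Illustrative straight-line depreciation arithmetic showing why extending
# a GPU fleet's assumed useful life lifts reported earnings: the same cost
# is spread over more years, so the annual expense falls.

def annual_depreciation(cost: float, salvage: float, useful_life_years: int) -> float:
    """Straight-line depreciation expense per year."""
    return (cost - salvage) / useful_life_years

fleet_cost = 10_000_000_000                    # hypothetical $10B GPU fleet
old = annual_depreciation(fleet_cost, 0, 4)    # original 4-year schedule
new = annual_depreciation(fleet_cost, 0, 6)    # extended 6-year schedule
print(f"4-year schedule: ${old:,.0f}/yr")      # $2,500,000,000/yr
print(f"6-year schedule: ${new:,.0f}/yr")
print(f"Annual pre-tax earnings lift: ${old - new:,.0f}")
```

Whether that lift is "cooking the books" or honest accounting hinges entirely on the empirical question above: whether the hardware really stays productive for the extra years.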
Countering the narrative of rapid burnout, CoreWeave cites historical data showing a nearly 10-year service life for older NVIDIA GPUs (K80) in major clouds. Older chips remain valuable for less intensive tasks, creating a tiered system where new chips handle frontier models and older ones serve established workloads.
Accusations that hyperscalers "cook the books" by extending GPU depreciation misunderstand hardware lifecycles. Older chips remain at full utilization for less demanding tasks. High operational costs (power, cooling) provide a natural economic incentive to retire genuinely unprofitable hardware, invalidating claims of artificial earnings boosts.