NVIDIA’s business model relies on planned obsolescence. Its AI chips become obsolete every 2-3 years as new versions are released, forcing Big Tech customers into a constant, multi-billion dollar upgrade cycle for what are effectively "perishable" assets.
The strongest evidence that corporate AI spending is generating real ROI is that major tech companies are not just re-ordering NVIDIA's chips, but accelerating those orders quarter over quarter. This sustained, growing demand from repeat customers validates the AI trend as a durable boom.
Despite bubble fears, NVIDIA’s record earnings signal a virtuous cycle. The real long-term growth is not just from model training but from the coming explosion in inference demand required for AI agents, robotics, and multimodal AI integrated into every device and application.
The real long-term threat to NVIDIA's dominance may not be a known competitor's roadmap but a black swan: Huawei. Leveraging undisclosed lithography advances and massive state investment, Huawei could surprise the market within 2-3 years by producing high-volume, low-cost, specialized AI chips, fundamentally altering the competitive landscape.
Hyperscalers are extending depreciation schedules for AI hardware. While this may look like "cooking the books" to inflate earnings, it's justified by the reality that even 7-8-year-old TPUs and GPUs are still running at 100% utilization on less complex AI tasks, making them valuable for longer and validating the accounting change.
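The earnings mechanics behind this debate are simple straight-line arithmetic. A minimal sketch with hypothetical figures (the $10B capex and 4-to-6-year schedules below are invented for illustration, not actual hyperscaler numbers):

```python
# Illustrative only: extending straight-line depreciation spreads the same
# capex over more years, lowering the annual expense and lifting reported
# earnings in the near term.

def annual_depreciation(capex: float, useful_life_years: int) -> float:
    """Straight-line depreciation: equal expense in each year of the asset's life."""
    return capex / useful_life_years

capex = 10_000_000_000  # hypothetical $10B GPU purchase

old_expense = annual_depreciation(capex, 4)  # 4-year schedule: $2.5B/yr
new_expense = annual_depreciation(capex, 6)  # 6-year schedule: ~$1.67B/yr

# The difference is expense shifted out of the current period,
# which flows straight into reported pre-tax earnings.
earnings_boost = old_expense - new_expense
print(f"Annual pre-tax earnings boost: ${earnings_boost / 1e9:.2f}B")  # prints "$0.83B"
```

Whether that $0.83B/yr is "cooking the books" or honest accounting depends entirely on whether the longer useful life is real, which is exactly the point of contention.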
The debate over AI chip depreciation highlights a flaw in traditional accounting. GAAP was designed for physical assets with predictable lifecycles, not for digital infrastructure like GPUs whose value creation is dynamic. This mismatch leads to accusations of financial manipulation where firms are simply following outdated rules.
NVIDIA's primary business risk isn't competition, but extreme customer concentration. Its top 4-5 customers represent ~80% of revenue. Each has a multi-billion dollar incentive to develop their own chips to reclaim NVIDIA's high gross margins, a threat most businesses don't face.
Traditional SaaS metrics like 80%+ gross margins are misleading for AI companies. High inference costs lower margins, but if the absolute gross profit per customer is multiples higher than a SaaS equivalent, it's a superior business. The focus should shift from margin percentages to absolute gross profit dollars and multiples.
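The margin-versus-dollars argument above can be made concrete with a toy comparison (all per-customer figures below are invented for illustration):

```python
# Illustrative only: a lower-margin AI product can still throw off far more
# gross profit per customer than a high-margin SaaS product, so comparing
# margin percentages alone is misleading.

def gross_profit(revenue_per_customer: float, gross_margin: float) -> float:
    """Gross profit dollars = revenue times gross margin."""
    return revenue_per_customer * gross_margin

saas = gross_profit(10_000, 0.80)   # $10k/yr contract at an 80% margin
ai   = gross_profit(100_000, 0.50)  # $100k/yr contract at a 50% margin

print(f"SaaS gross profit per customer: ${saas:,.0f}")        # prints "$8,000"
print(f"AI gross profit per customer:   ${ai:,.0f}")          # prints "$50,000"
print(f"AI advantage: {ai / saas:.2f}x")                      # prints "6.25x"
```

On margin percentage the AI business looks worse (50% vs. 80%); on absolute gross profit dollars it is several times better, which is the metric the argument says should matter.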
Accusations that hyperscalers "cook the books" by extending GPU depreciation misunderstand hardware lifecycles. Older chips remain at full utilization for less demanding tasks. High operational costs (power, cooling) provide a natural economic incentive to retire genuinely unprofitable hardware, invalidating claims of artificial earnings boosts.
The AI infrastructure boom is a potential house of cards. A single dollar of end-user revenue paid to a company like OpenAI can become $8 of "seeming revenue" as it cascades through the value chain to Microsoft, CoreWeave, and NVIDIA, supporting an unsustainable $100 of equity market value.
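The double-counting mechanic behind the $1-to-$8 cascade can be sketched as a geometric sum. The 90% pass-through rate and 15-layer depth below are invented assumptions chosen to reproduce the ratio the argument cites:

```python
# A hedged sketch of the cascade described above: if each layer in the chain
# (OpenAI -> Microsoft -> CoreWeave -> NVIDIA -> ...) books its inflow as
# revenue and spends most of it with the next layer, the summed "revenue"
# across the chain is a multiple of the single real end-user dollar.

def seeming_revenue(real_dollar: float, pass_through: float, layers: int) -> float:
    """Total revenue booked across all layers when a fixed fraction flows onward."""
    total, flow = 0.0, real_dollar
    for _ in range(layers):
        total += flow          # this layer books its inflow as revenue
        flow *= pass_through   # most of it is spent with the next layer
    return total

# With a hypothetical 90% pass-through over 15 hops, $1 of real end-user
# revenue sums to roughly the $8 of "seeming revenue" the argument cites.
print(f"${seeming_revenue(1.00, 0.90, 15):.2f}")  # prints "$7.94"
```

Applying a price-to-revenue multiple in the low teens to that booked revenue then yields roughly the $100 of equity market value the argument warns is resting on a single real dollar.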
The narrative of endless demand for NVIDIA's high-end GPUs is flawed. It could be undermined by two forces: the shift of AI inference onto devices running models from local flash memory, reducing cloud reliance, and Google's ability to give away its increasingly powerful Gemini AI for free, undercutting the revenue models that fuel GPU demand.