Jensen Huang argues the "AI bubble" framing is too narrow. The real trend is a permanent shift from general-purpose to accelerated computing, driven by the end of Moore's Law. This shift powers not just chatbots, but multi-billion dollar AI applications in automotive, digital biology, and financial services.
The strongest evidence that corporate AI spending is delivering real ROI is that major tech companies are not merely re-ordering NVIDIA's chips but accelerating those orders quarter over quarter. Sustained, growing demand from repeat customers marks the AI trend as a durable boom rather than a one-off spike.
Jensen Huang criticizes the focus on a monolithic "God AI," calling it an unhelpful sci-fi narrative. He argues this distracts from the immediate and practical need to build diverse, specialized AIs for specific domains like biology, finance, and physics, which have unique problems to solve.
An NVIDIA director highlights a significant, under-the-radar growth vector: accelerating traditional enterprise software. Oracle's decision to run its classic database on GPUs represents a trillion-dollar infrastructure shift from CPUs to GPUs for core business applications, signaling that NVIDIA's addressable market extends far beyond the current AI boom.
Despite bubble fears, NVIDIA's record earnings signal a virtuous cycle. The real long-term growth comes not just from model training but from the coming explosion in inference demand required for AI agents, robotics, and multimodal AI integrated into every device and application.
Jensen Huang forecasts that the next major AI breakthrough will be in digital biology. He believes advances in multimodality, long context models, and synthetic data will converge to create a "ChatGPT moment," enabling the generation of novel proteins and chemicals.
The debate over whether AI can reach $1T in annual revenue is misguided; it is already reality. Core services at companies like TikTok, Meta, and Google have recently shifted from CPU-based systems to AI running on GPUs, so their entire revenue base is now AI-driven; any future AI revenue is incremental on top of that base.
Beyond selling chips, NVIDIA strategically directs the industry's focus. By providing tools, open-source models, and setting the narrative around areas like LLMs and now "physical AI" (robotics, autonomous vehicles), it essentially chooses which technology sectors will receive massive investment and development attention.
Critics like Michael Burry argue current AI investment far outpaces 'true end demand.' However, the bull case, supported by NVIDIA's earnings, is that this isn't a speculative bubble but the foundational stage of the largest infrastructure buildout in decades, with capital expenditures already contractually locked in.
AI's computational needs are not limited to initial training. They compound through post-training (reinforcement learning) and inference-time, multi-step reasoning, creating a far larger demand profile than previously understood and driving what Huang describes as a billion-fold increase in required compute.
Countering the narrative of insurmountable training costs, Jensen Huang argues that architectural, algorithmic, and computing stack innovations are driving down AI costs far faster than Moore's Law. He predicts a billion-fold cost reduction for token generation within a decade.
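A back-of-the-envelope check of that forecast: a billion-fold reduction over ten years implies roughly an eightfold cost improvement every year, far outpacing Moore's Law. The sketch below derives that annual rate purely from the figures stated in the claim; the numbers are Huang's projection, not measured data.

```python
# Implied annual improvement rate for a billion-fold cost
# reduction over a decade (figures taken from the claim above).
total_reduction = 1e9   # claimed overall cost-reduction factor
years = 10              # claimed time frame

annual_factor = total_reduction ** (1 / years)
print(f"Implied cost reduction per year: ~{annual_factor:.1f}x")

# Moore's Law, for comparison: roughly 2x every two years.
moore_per_year = 2 ** (1 / 2)
print(f"Moore's Law pace: ~{moore_per_year:.2f}x per year")
```

The gap between ~7.9x per year and Moore's Law's ~1.4x per year is the crux of the argument: most of the projected gains must come from architecture, algorithms, and the software stack rather than from transistor scaling alone.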