The next wave of data growth will be driven by countless sensors, especially cameras, sending video upstream to cloud AI models for processing. Robust upstream bandwidth, not just downstream, becomes the critical new infrastructure bottleneck, forcing a shift to symmetrical networks like fiber and creating a significant opportunity for telecom companies.
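The upstream pressure described above can be made concrete with a back-of-envelope calculation. The camera count and per-stream bitrate below are illustrative assumptions, not figures from the text:

```python
# Illustrative estimate of aggregate upstream bandwidth demand from
# always-on video sensors. Camera count and bitrate are assumed values
# chosen for illustration only.

def upstream_demand_gbps(num_cameras: int, bitrate_mbps: float) -> float:
    """Aggregate sustained upstream bandwidth in Gbps."""
    return num_cameras * bitrate_mbps / 1000

# e.g. 10,000 cameras in one district, each uploading a 5 Mbps stream
# to cloud AI models for processing:
demand = upstream_demand_gbps(10_000, 5.0)
print(f"{demand:.0f} Gbps of sustained upstream capacity")  # 50 Gbps
```

Even these modest assumptions yield tens of gigabits per second of continuous uplink, a load that asymmetric last-mile networks were never provisioned for.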
Unlike the 4G and 5G revolutions, which were driven by consumer video, 6G will be defined by its utility for enterprise AI applications. The key advances will be in managing network performance, reducing latency, and adding the security layers businesses require, rather than simply increasing consumer bandwidth.
Today's AI is largely text-based (LLMs). The next phase involves vision-language models (VLMs) that interpret and interact with the physical world for applications like robotics and surgery. This transition requires a 50- to 1,000-fold increase in compute power, underwriting the long-term AI infrastructure build-out.
Unlike the speculative "dark fiber" buildout of the dot-com bubble, today's AI infrastructure race is driven by real, immediate, and overwhelming demand. The problem isn't a lack of utilization for built capacity; it's a constant struggle to build supply fast enough to meet customer needs.
With past shifts like the internet or mobile, we understood the physical constraints (e.g., modem speeds, battery life). With generative AI, we lack a theoretical understanding of its scaling potential, making it impossible to forecast its ultimate capabilities beyond "vibes-based" guesses from experts.
For years, access to compute was the primary bottleneck in AI development. Now that public web data is largely exhausted, the limiting factor is access to high-quality, proprietary data from enterprises and human experts. This shifts the focus from building massive infrastructure to forming data partnerships and securing domain expertise.
Unlike the dot-com bubble's finite need for fiber optic cables, the demand for AI is infinite because it's about solving an endless stream of problems. This suggests the current infrastructure spending cycle is fundamentally different and more sustainable than previous tech booms.
AI's computational needs are not driven by initial training alone. They compound through post-training (reinforcement learning) and inference (multi-step reasoning), creating a far larger demand profile than previously understood and driving a billion-fold increase in compute.
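A toy model makes the compounding effect visible. Every multiplier below is an assumption for illustration; the takeaway itself claims only that post-training and inference multiply demand far beyond the pre-training baseline:

```python
# Toy model of how AI compute demand compounds beyond pre-training.
# All multipliers are illustrative assumptions, not sourced figures.

pretraining = 1.0                # baseline: one pre-training run
post_training_factor = 10       # RL / fine-tuning passes (assumed)
reasoning_steps = 50            # chain-of-thought steps per query (assumed)
queries_per_model = 1_000_000   # lifetime inference volume (assumed)
step_cost = 1e-6                # inference step cost relative to pre-training (assumed)

inference = reasoning_steps * queries_per_model * step_cost
total = pretraining * post_training_factor + inference

print(f"Demand relative to a single pre-training run: {total:.0f}x")
```

Even with these conservative placeholder numbers, lifetime demand is dominated by post-training and inference rather than the original training run, which is the structural point the takeaway makes.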
The infrastructure demands of AI have driven a dramatic increase in data center scale. Two years ago, a 1-megawatt facility was considered a good size; today, a large AI data center is a 1-gigawatt facility, a 1,000-fold increase. This rapid escalation underscores the immense capital investment required to power AI.
AT&T's CEO reframes the network debate, stating that fiber is the universal backbone. Technologies like 5G and satellite are simply different methods for connecting end-users to this core fiber infrastructure, not true competitors to it.