A Vision for the Next Era of Compute from Structure Research’s Jabez Tan
Framing the Future of AI Infrastructure
At the infra/STRUCTURE Summit 2025, held October 15–16 at the Wynn Las Vegas, Jabez Tan, Head of Research at Structure Research, opened the event with a forward-looking keynote titled “Where Is AI Taking Data Centers?” His presentation provided a data-driven perspective on how artificial intelligence (AI) is reshaping digital infrastructure, redefining scale, design, and economics across the global data center ecosystem.
Tan’s session served as both a retrospective on how far the industry has come and a roadmap for where it’s heading. With AI accelerating demand beyond traditional cloud models, his insights set the tone for two days of deep discussion among the sector’s leading operators, investors, and technology providers.
From the Edge to the Core – A Redefinition of Scale
Tan began by looking back just a few years to what he called “the 2022 era of edge obsession.” At that time, much of the industry believed the future of cloud would depend on thousands of small, distributed edge data centers. “We thought the next iteration of cloud would be hundreds of sites at the base of cell towers,” Tan recalled. “But that didn’t really happen.”
Instead, the reality has inverted. “The edge has become the new core,” he said. “Rather than hundreds of small facilities, we’re now building gigawatts of capacity in centralized regions where power and land are available.”
That pivot, Tan emphasized, is fundamentally tied to economics, where cost, energy, and accessibility converge. It reflects how hyperscalers and AI developers are chasing efficiency and scale over proximity, redefining where and how the industry grows.
The AI Acceleration – Demand Without Precedent
Tan then unpacked the explosive demand for compute since late 2022, when AI adoption began its steep ascent following the launch of ChatGPT. He described the industry’s trajectory as a “roller coaster” marked by alternating waves of panic and optimism—but one with undeniable momentum.
The numbers he shared were striking. NVIDIA’s GPU shipments, for instance, have skyrocketed: from 1.3 million H100 Hopper GPUs in all of 2024 to 3.6 million Blackwell GPUs sold in just the first three months of 2025, nearly a threefold increase in unit volume. “That translates to an increase from under one gigawatt of GPU-driven demand to over four gigawatts in a single year,” Tan noted.
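As a rough illustration of how those unit volumes translate into power, the back-of-envelope sketch below converts GPU shipment counts into gigawatts of demand. The per-GPU wattages (~700 W for an H100, ~1,200 W for a Blackwell-class part) are illustrative assumptions, not figures from Tan’s talk; they are chosen only to show that his under-one-to-over-four-gigawatt arithmetic is plausible.

```python
# Back-of-envelope: GPU shipments -> GPU-driven power demand.
# The per-GPU wattages are illustrative assumptions, not figures
# from Tan's presentation.

def gpu_power_gw(gpu_count: int, watts_per_gpu: float) -> float:
    """Estimate aggregate GPU power draw in gigawatts."""
    return gpu_count * watts_per_gpu / 1e9

# Assumed draws: ~700 W per H100, ~1,200 W per Blackwell-class GPU.
hopper_gw = gpu_power_gw(1_300_000, 700)       # ~0.9 GW
blackwell_gw = gpu_power_gw(3_600_000, 1_200)  # ~4.3 GW

print(f"2024 Hopper shipments:  ~{hopper_gw:.1f} GW")
print(f"Early-2025 Blackwell:   ~{blackwell_gw:.1f} GW")
# Facility-level demand runs higher still once cooling and power
# conversion overhead (PUE, typically 1.1-1.4) are factored in.
```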
Tan linked this trend to a broader shift: “AI isn’t just consuming capacity, it’s generating revenue.” Large language model (LLM) providers like OpenAI, Anthropic, and xAI are now producing billions of dollars in annual revenue directly tied to compute access, signaling a business model where infrastructure equals monetization.
Measuring in Compute, Not Megawatts
One of the most notable insights from Tan’s session was his argument that power is no longer the most accurate measure of data center capacity. “Historically, we measured in square footage, then in megawatts,” he said. “But with AI, the true metric is compute, the amount of processing power per facility.”
This evolution is forcing analysts and operators alike to rethink capacity modeling and investment forecasting. Structure Research, Tan explained, is now tracking data centers by compute density, a more precise reflection of AI-era workloads. “The way we define market share and value creation will increasingly depend on how much compute each facility delivers,” he said.
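To make the change of metric concrete, here is a minimal sketch of what ranking facilities by deliverable compute rather than by power rating might look like. Every facility figure and per-GPU performance number below is a hypothetical placeholder, not Structure Research data.

```python
# Minimal sketch: ranking facilities by deliverable compute rather
# than by power rating. All facility and per-GPU figures below are
# hypothetical placeholders, not Structure Research data.
from dataclasses import dataclass

@dataclass
class Facility:
    name: str
    power_mw: float        # traditional capacity metric
    gpu_count: int
    pflops_per_gpu: float  # peak petaFLOPS per GPU (precision-dependent)

    @property
    def compute_eflops(self) -> float:
        """Deliverable compute in exaFLOPS: the AI-era capacity metric."""
        return self.gpu_count * self.pflops_per_gpu / 1000

sites = [
    Facility("Site A", power_mw=100, gpu_count=50_000, pflops_per_gpu=2.0),
    Facility("Site B", power_mw=150, gpu_count=40_000, pflops_per_gpu=1.0),
]

for f in sorted(sites, key=lambda s: s.compute_eflops, reverse=True):
    print(f"{f.name}: {f.power_mw} MW, {f.compute_eflops:.0f} EFLOPS")
```

The point of the sketch is that the two rankings can disagree: Site B has more megawatts, but Site A delivers more compute, which is exactly the re-ordering Tan argues will reshape how market share and value are measured.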
From Training to Inference – The Next Compute Shift
Tan projected that as AI matures, the balance between training and inference workloads will shift dramatically. “Today, roughly 60% of demand is tied to training,” he explained. “Within five years, 80% will be inference.”
That shift will reshape infrastructure needs, pushing more compute toward distributed yet interconnected environments optimized for real-time processing. Tan described a future where inference happens continuously across global networks, increasing utilization, efficiency, and energy demands simultaneously.
The Coming Capacity Crunch
Perhaps the most sobering takeaway from Tan’s talk was his projection of a looming data center capacity shortfall. Based on Structure Research’s modeling, global AI-related demand could grow from 13 gigawatts in 2025 to more than 120 gigawatts by 2030, far outpacing current build rates.
“If development doesn’t accelerate, we could face a 100-gigawatt gap by the end of the decade,” Tan cautioned. He noted that 81% of capacity under development in the U.S. today comes from credible, established providers, but even that won’t be enough to meet demand. “The solution,” he said, “requires the entire ecosystem, from utilities and regulators to financiers and developers, to work in sync.”
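For a sense of how that shortfall arises, the sketch below works through the arithmetic. The demand endpoints (13 GW in 2025, 120 GW in 2030) come from Tan’s projection; the supply trajectory is a hypothetical assumption chosen purely for illustration.

```python
# Minimal sketch of the projected capacity gap. The demand endpoints
# (13 GW in 2025, 120 GW in 2030) are from Tan's talk; the supply
# growth rate is a hypothetical assumption chosen for illustration.

demand_2025, demand_2030 = 13.0, 120.0  # gigawatts
years = 5

# Implied compound annual growth rate of demand: ~56% per year.
demand_cagr = (demand_2030 / demand_2025) ** (1 / years) - 1
print(f"Implied demand CAGR: {demand_cagr:.0%}")

# Hypothetical supply trajectory: same 2025 base, ~10% annual build growth.
supply_growth = 0.10
supply_2030 = demand_2025 * (1 + supply_growth) ** years  # ~21 GW

print(f"2030 supply at {supply_growth:.0%}/yr builds: ~{supply_2030:.0f} GW")
print(f"Projected shortfall: ~{demand_2030 - supply_2030:.0f} GW")
```

Under these assumptions, supply reaches only about 21 gigawatts by 2030, leaving a shortfall on the order of the 100 gigawatts Tan warned about.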
Fungibility, Flexibility, and the AI Architecture of the Future
Tan also emphasized that AI architecture must become fungible, able to handle both inference and training workloads interchangeably. He explained how hyperscalers are now demanding that facilities support variable cooling and compute configurations, often shifting between air and liquid systems based on real-time needs.
“This isn’t just about designing for GPUs,” he said. “It’s about designing for fluidity, so workloads can move and scale without constraint.”
Tan illustrated this with real-world examples of AI inference deployments requiring hundreds of cross-connects for data exchange and instant access to multiple cloud platforms. “Operators are realizing that connectivity, not just capacity, is the new value driver,” he said.
Agentic AI – A Telescope for the Mind
To close, Tan explored the concept of agentic AI, systems that not only process human inputs but act autonomously across interconnected platforms. He compared its potential to the invention of the telescope.
“When Galileo introduced the telescope, it challenged humanity’s view of its place in the universe,” Tan said. “Large language models are doing something similar for intelligence. They make us feel small today, but they also open an entirely new frontier for discovery.”
He concluded with a powerful metaphor: “If traditional technologies were tools humans used, AI is the first technology that uses tools itself. It’s a telescope for the mind.”
A Market Transformed by Compute
Tan’s session underscored that AI is redefining not only how data centers are built but also how they are measured, financed, and valued. The industry is entering an era where compute density is the new currency, where inference will dominate workloads, and where collaboration across the entire ecosystem is essential to keep pace with demand.
infra/STRUCTURE 2026: Save the Date
Want to tune in live, receive all presentations, and gain access to C-level executives, investors, and industry-leading research? Then save the date for infra/STRUCTURE 2026, set for October 7–8, 2026, at the Wynn Las Vegas. Pre-registration for the 2026 event is now open; visit www.infrastructuresummit.io to learn more.