The data center operators that make it possible to move, store, and use the data on which today’s digital world depends are no strangers to the disruptive power of new technologies, nor are those that create and operate the wired and wireless networks that tie everything together. The same can be said for the equally important chip, processor, and switch makers whose products deliver the compute power on which everything else depends.

From the beginning, big innovations and significant computing shifts have brought with them a host of challenges – and yes, in many cases, more than a little panic. Less than 15 years ago, more than a few equipment makers behind CDMA and GSM networks wondered whether the very cellular networks they had built could cope with the dramatic increase in data traffic brought on by the then-new, and now ubiquitous, smartphone.

In much the same way, data center operators responded to the proliferation of over-the-top media – not just from smartphones, but from devices of every configuration imaginable – by building the data centers needed to make streaming viable, all while building out the cloud computing infrastructure that innumerable applications used in daily life rely on today. In just 15 years, we went from Blockbuster video stores in thousands of towns to streaming media over an advanced, powerful, and increasingly omnipresent web of data centers and high-speed networks that stretches around the world.

Of course, all of this was made possible by technological advancements that enabled solution providers and innovators to meet society’s insatiable demand for data. Fundamentals like Moore’s Law, while pressed to the brink, held true – with companies like Moore’s own Intel developing ever faster and more powerful chips and processors, even as innovations like virtualization, flash storage, and LTE emerged to make it all work.

Fast forward a decade, and the smartphone that threatened to overwhelm cellular networks carries more compute power, more storage, and more memory than the computers used to send astronauts to the moon and back. So it should come as no surprise that, with such success, our collective confidence as technologists grew.

Technology, after all, would self-regulate. As with our response to the smartphone and streaming data, the demands of one innovation would be met by still more advancements.

But what happens when one innovation or computing trend is so large, so pervasive, and so groundbreaking that we cannot respond effectively to the infrastructure demands that come with it? What’s more, what if two of these truly transformative trends occur at the same time and Moore’s Law no longer suffices – or no longer applies? What if Moore’s Law is dead, as some argue? More to the point, can we actually innovate ourselves to a point of failure?

AI and the IoT – taking data centers to the brink

Today’s data centers are technological marvels that bear only a passing resemblance to their predecessors. Developments in everything from switches to liquid cooling – and even innovative ways of using the very heat they produce to benefit surrounding communities – abound. It goes without saying that data center operators, and those that provide the technologies they rely on, have for years risen to the occasion. But that may be changing.

Two wildly hyped trends – the proliferation and application of artificial intelligence, and the dramatic growth of the Internet of Things – are overwhelming, and will continue to overwhelm, data centers and the networks that tie them together. The problem is straightforward: the hype around AI and the IoT is real.

Forrester Research predicts that spending on generative AI will increase 36% annually through 2030, while Grand View Research calls for the AI market as a whole to grow at a CAGR of 35.9% from 2025 to 2030. The number of devices that comprise the IoT is simultaneously exploding, with IoT Analytics predicting 40 billion connected devices – also by 2030 – a dramatic increase over the 16.6 billion devices the firm estimated were in use at the end of 2024.
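As a sanity check on those forecasts, the growth rate implied by the IoT Analytics figures cited above can be computed directly; this is a minimal compound-growth sketch, with the 16.6 billion (end of 2024) and 40 billion (2030) device counts taken from the text:

```python
# Implied compound annual growth rate (CAGR) of connected IoT devices,
# using the IoT Analytics figures cited in the text.
devices_2024 = 16.6e9   # estimated devices in use at the end of 2024
devices_2030 = 40e9     # predicted devices by 2030
years = 6               # 2024 -> 2030

cagr = (devices_2030 / devices_2024) ** (1 / years) - 1
print(f"Implied IoT device CAGR: {cagr:.1%}")  # roughly 16% per year
```

Even at a gentler pace than the AI market’s forecast 35.9% CAGR, that still means the device population more than doubles in six years.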

Both trends have two things in common: each results in a radical increase in data volumes, and each requires a dramatic increase in network capacity and in the data centers that capacity connects. With both occurring at once, we are quickly running out of headroom.

When too much of a good thing is truly too much

It’s well known that the glut of data accompanying AI and the IoT is an issue, something made overtly clear as hyperscalers rush to embrace nuclear power in the face of AI’s dramatic power consumption. Until recently, the idea that Microsoft would play a prominent role in bringing Three Mile Island back into operation, or that Amazon and Google would look to deploy nuclear reactors to power data centers, would have seemed otherworldly. Not today.

Nor is it surprising that U.S. households have, on average, more than 20 connected devices. It is likewise accepted that connectivity and the IoT are crucial in business today, from sensors on machines and parcels to connected robots, drones, and vehicles.

What should be surprising, though, is the sheer scale of the challenge each of these trends presents to data center operators, something a little-covered event and recent research make clear. Both should serve as a wake-up call to our industry:

  • Former Google CEO Eric Schmidt’s Testimony to Congress: Schmidt did not mince words when describing the need for larger data centers or the power they will demand, noting that they will shortly require 99% of all the power generated today. Notably, he also pointed out that 10-gigawatt data centers are in the works, while the average nuclear power plant puts out one gigawatt.
  • McKinsey & Company’s Research on Data Center Construction: McKinsey recently estimated that to avoid a capacity deficit by 2030, “at least twice the data center capacity built since 2000 would have to be built in less than a quarter of the time.” In dollar terms, McKinsey believes that by “2030, data centers are projected to require $6.7 trillion worldwide to keep pace with the demand for compute power.” That figure is more than 20% of the entire projected GDP of the United States this year.
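The arithmetic behind those two comparisons is worth making explicit. The $6.7 trillion figure and the 10-gigawatt/1-gigawatt comparison come from the sources above; the roughly $30 trillion projected 2025 U.S. GDP is an assumed round figure used here only to illustrate the ratio:

```python
# Rough scale check on the figures above. The $6.7T estimate is McKinsey's;
# the ~$30T projected 2025 US GDP is an assumed round figure for illustration.
mckinsey_capex = 6.7e12   # projected worldwide data center spend through 2030 (USD)
us_gdp_2025 = 30e12       # assumed projected 2025 US GDP (USD)
print(f"Spend vs. US GDP: {mckinsey_capex / us_gdp_2025:.0%}")  # about 22%

# Schmidt's comparison: a planned 10 GW data center vs. a ~1 GW nuclear plant.
datacenter_gw = 10
plant_gw = 1
print(f"Nuclear plants per planned site: {datacenter_gw // plant_gw}")  # 10
```

In other words, a single planned campus would consume the output of roughly ten average nuclear plants, and the projected build-out rivals a fifth of the world’s largest economy.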

Where do we go from here?

Both Schmidt’s testimony and McKinsey’s research should be a wake-up call for the data center sector. We are at a crossroads, one I believe requires us to acknowledge several points, including:

  • The Capital Costs Associated With New Data Centers Are Prohibitive: The capital required to build the data centers we need, and the networks required to connect them, simply does not exist. We cannot, in a five-year period, earmark the massive sums required, nor do we have the labor and construction capacity to complete these facilities even if such lofty budgets existed.
  • We Don’t Have Enough Time: Read the headlines on any given day and you will see articles on towns and municipalities questioning, and often overtly resisting, the construction of new mega data centers. Even if these projects were ultimately to move forward, the legal delays alone all but ensure a deficit in data processing capacity in the years to come.
  • Faster Chips and Processors Will Not Suffice: Yes, quantum computing and other innovations hold promise, but we will not be able to solve the problems we now face simply by creating faster or more efficient hardware – at least not in time to address the very real data center deficit now emerging.

Most importantly, we must be realistic and look at data itself. With AI and the IoT, the time has come to reconsider the very fundamentals of how data is configured, coded, moved, shared, used, stored, and secured. Only then will we, like so many before us, rise to the occasion and deliver the advancements that innovation requires.

###

Charles Yeomans is the founder and CEO of Atombeam, whose Data-as-Codewords technology fundamentally changes how computers communicate while simultaneously decreasing the size of data by 75% and increasing available bandwidth by 4x or more. You can learn more about Charles and Atombeam on X at @AtomBeamTech.