At the DCD Investment Forum, colocated with DCD>Connect New York on March 24 and 25, Ilissa Miller, Founder and CEO of iMiller Public Relations, moderated the keynote panel titled “The Disruption of AI in Data Centers”. The fireside chat-style discussion featured two powerhouses in digital infrastructure: Dan Golding, CTO of Appleby Strategy Group, and Andrej Danis, Infrastructure Lead at Alix Partners. What unfolded was an unfiltered, high-impact conversation designed to help data center investors separate hype from hard truth.
Don’t worry! If you missed the standing-room-only session, here’s what you — and every digital infrastructure investor — should take away from the discussion:
Agentic AI Is Coming — And It Changes Everything
We’re all familiar with generative AI like ChatGPT, but as Dan Golding emphasized, “That’s a demo. The AI that’s coming is agentic AI — autonomous assistants for everyone in your company.” These agents won’t just answer questions — they’ll schedule meetings, manage expenses, and negotiate on your behalf.
Andrej Danis added a critical business lens: “The transformation required to adopt something as simple as an AI assistant is still underestimated. Operational readiness will be key to real adoption.”
In a nutshell, we’re still playing in the sandbox, kids! The real AI has yet to be realized, and that same sentiment will likely hold as each new wave of AI agents is introduced.
Technology Roadmaps Are a Moving Target
At NVIDIA’s GTC event, held in San Jose, CA the week of March 17, 2025, NVIDIA CEO Jensen Huang shared his vision of a $1 trillion computing inflection point by 2028, but the pace of change is staggering.
Golding tempered this point with a reality check: “Last year, they said Blackwell would ship immediately — that didn’t happen. They’re being shipped now, but many aren’t working due to thermal issues. You can add one or two years to every date Jensen gave.”
Astute investors must be discerning about the information coming at them and assess what is realistic to achieve. Chips may be manufactured, but are they ready? Is the market ready? Can the infrastructure support and sustain them? Time will tell, and infrastructure investors need the stomach to take risk and to be in it for the long haul.
Depreciation and ROI: Prepare for Compression
If you’re still modeling 7-year GPU depreciation, you might be in trouble. “Depreciation is the most important thing in the AI business,” Golding warned. “GPUs need 90% utilization. They must be working at 3 a.m., not just training, but performing live inference.”
That creates a need for orchestration software — and not just any software. As Golding put it, “This is a software game as well as a hardware game.”
‘AI factories,’ the new term for GPU-as-a-service providers, along with colocation and data center operators, are grappling with the installation of Blackwell chips, even as new iterations and the promise of a 600 kW rack were disclosed at the GTC event. This poses a challenge for companies investing in GPUs today. Jensen’s roadmap has new processors arriving sooner rather than later, shifting depreciation cycles. Investors are encouraged to challenge the idea of long depreciation cycles and temper it with utilization goals.
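As a rough illustration of why those two levers matter so much together, here is a minimal sketch of how the depreciation window and utilization jointly set the hourly revenue a GPU server must earn just to break even. The prices, operating costs, and utilization figures are entirely hypothetical assumptions for illustration; none of them came from the panel.

```python
# Rough, illustrative sketch: how depreciation period and utilization
# drive the hourly revenue a GPU server must earn to break even.
# All numbers below are hypothetical assumptions, not figures from the panel.

HOURS_PER_YEAR = 8760

def required_hourly_rate(capex, depreciation_years, utilization, opex_per_year=0.0):
    """Revenue per utilized hour needed to recover capex plus annual opex."""
    annual_capex = capex / depreciation_years
    utilized_hours = HOURS_PER_YEAR * utilization
    return (annual_capex + opex_per_year) / utilized_hours

# Hypothetical $300k GPU server with $40k/year in power and operations.
for years in (7, 5, 3):
    for util in (0.5, 0.9):
        rate = required_hourly_rate(300_000, years, util, opex_per_year=40_000)
        print(f"{years}-year depreciation, {util:.0%} utilization: "
              f"~${rate:,.0f}/hour per server")
```

Shortening the depreciation window and letting utilization slip both push the hurdle rate sharply higher, which is exactly why Golding stresses 3 a.m. inference work rather than idle capacity.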
CapEx Just Went Stratospheric
During the discussion, the question was asked directly: how much more capital-intensive is GPU infrastructure?
Golding didn’t hesitate: “It’s about 20 times more expensive than a CPU-based machine — and the depreciation window is shorter.”
This changes everything from design to financial modeling. Danis pointed out, “Many companies today are cash flow negative — some may go bust, but the industry will thrive. This is the new gold rush, and you can bet on the right thing… or the wrong one.”
So the challenge for investors is this: how do you stay positive in such a demanding climate? Playing the long game is the short answer. Investors who want a quick return, or any return in under a decade, may be hard pressed, especially given the growing capital-intensive costs of infrastructure upgrades (think liquid cooling), redesigns, and new builds, plus the astronomical cost of the GPU chips alone!
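To put the “20 times more expensive” figure in perspective, here is a back-of-the-envelope comparison of annual capital charges. The prices and depreciation periods are hypothetical assumptions chosen purely for illustration, not numbers cited by the panel.

```python
# Back-of-the-envelope annual capital charge: CPU server vs GPU server.
# Prices and depreciation periods are hypothetical, chosen only to show
# how a ~20x purchase-price gap widens once depreciation windows shrink.

cpu_server = {"price": 15_000, "depreciation_years": 6}
gpu_server = {"price": 15_000 * 20, "depreciation_years": 3}  # ~20x price, shorter life

for name, s in (("CPU server", cpu_server), ("GPU server", gpu_server)):
    annual = s["price"] / s["depreciation_years"]
    print(f"{name}: ${annual:,.0f} per year in capital charge")

# Under these assumptions, the 20x sticker-price gap becomes roughly a 40x gap
# in annual capital cost once the shorter GPU depreciation window is applied.
```

The exact multiples depend on the assumptions, but the direction is the point: shorter depreciation windows amplify the capital intensity rather than smoothing it out.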
This Is No Longer a Real Estate Play
During the session, Miller made the point clearly: “This is not a real estate play anymore — this is a technology play. You need to understand the technology in order to build infrastructure that supports what’s coming.”
Golding echoed that strongly, repeating what an industry CTO said to him at GTC: “If you don’t understand the workload, you’re not in this business.”
Understanding workloads is now essential; it affects power, cooling, networking, and software decisions. Investors in the space may be grappling to find experts who understand the complexities of this technology. Aren’t we all? Innovations and opportunities are being introduced at a staggering pace, and the people to be in touch with are experienced engineers and those familiar with actual deployments and the challenges unfolding on the ground. Frontline experience is key to ensuring ongoing success. Traditional financial modeling needs to be re-examined from a multitude of angles, which poses a challenge for anyone who would treat these investments as a simple real estate translation (land, power, connectivity). The real question is what you are building data centers for.
Expect New Pricing Models to Emerge
AI software models are already charging by the token. That could disrupt how services are priced and consumed. Danis shared: “For the investment community, you can think of a token as output. It’s not just about hours anymore — it’s what you get from those hours.”
Golding added, “Right now, GPU clouds are billed per hour — shifting to token pricing will be tough, but what’s really changed is the network. Requirements are up 30x over CPU workloads. That’s triggered new fiber investments we haven’t seen in 25 years.”
How could this different pricing model affect financial models focused on power? We have yet to see the shift from a top-line perspective, though it is already happening at the user level. For example, Miller shared that companies providing AI development tools already charge by the token. As end users get used to that model, translating it into power terms will be important to convey. Imagine a future where the question is: “If I support a trillion tokens, how much power do I need?”
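That question can already be sketched as back-of-the-envelope arithmetic. The throughput, power-draw, and PUE figures below are hypothetical assumptions for illustration only and were not discussed in the session; the point is the translation from a token budget to a megawatt figure, not the specific numbers.

```python
# Hypothetical translation of a token budget into a power requirement.
# Throughput, power-draw, and PUE figures are illustrative assumptions only.

TOKENS_TO_SERVE = 1_000_000_000_000      # one trillion tokens
TOKENS_PER_SEC_PER_GPU = 1_000           # assumed inference throughput per GPU
WATTS_PER_GPU = 1_200                    # assumed draw per GPU, incl. server share
PUE = 1.3                                # assumed facility overhead
WINDOW_DAYS = 30                         # serve the token budget over one month

seconds = WINDOW_DAYS * 24 * 3600
gpus_needed = TOKENS_TO_SERVE / (TOKENS_PER_SEC_PER_GPU * seconds)
it_power_mw = gpus_needed * WATTS_PER_GPU / 1e6
facility_power_mw = it_power_mw * PUE

print(f"GPUs needed: ~{gpus_needed:,.0f}")
print(f"IT power: ~{it_power_mw:.2f} MW, facility power: ~{facility_power_mw:.2f} MW")
```

Swap in real throughput numbers for a given model and GPU generation and the same arithmetic scales linearly with the token budget, which is how a token-based top line could eventually be mapped back onto megawatts.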
Long-Term Hype vs. Near-Term Reality
From 2-gigawatt campus announcements to AI moonshots, the panel pushed back on some of the over-enthusiasm in the market.
Golding delivered a sober reality check: “There are no two-gigawatt data centers. What really exists is a 300 to 500 MW site that might grow over 10–20 years. These big numbers are planning exercises, not build-outs.”
While bigger may seem better, the reality is that the power infrastructure needs to be there to support these builds, and the U.S. just doesn’t have it yet. The planning cycles for these massive developments run 10–20 years out, which means investors must weigh their appetite for the long term versus the near term.
Investing Here Requires Bravery and Patience
Golding said it best: “You need intestinal fortitude to invest in this sector. Calm down. Invest — or just observe — but don’t freak out every time a hyperscaler misses a release date.”
As Danis put it: “This is not as trivial as data center investment was once perceived to be.”
Final Thoughts
Miller wrapped up the session by emphasizing: “We need capital investment to upgrade and build the next generation of infrastructure. But to invest well, you need to understand what’s happening inside those data centers — not just around them.”
The takeaway? Get close to the technology. Ask hard questions. Partner with experienced teams. And be ready to adapt. If you’re not building for tomorrow’s workloads, you’re building the wrong thing.