Phantom data centers: what they are (or aren’t) and why they’re holding back the true promise of AI



In the age of AI, utilities face a new, unexpected problem: phantom data centers. On the surface, it might seem absurd: Why (and how) would someone fake something as complex as a data center? But as demand for AI skyrockets along with the need for more computing power, speculation surrounding data center development is wreaking havoc, especially in areas like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are bombarded with requests for power from real estate developers who may or may not actually build the infrastructure they claim.

Phantom data centers represent an urgent bottleneck in scaling data infrastructure to meet computing demand. This emerging phenomenon prevents capital from flowing where it should. Any enterprise that can help solve this problem—perhaps using AI to solve a problem created by AI—will have a significant advantage.

The gigawatt mirage

Dominion Energy, Northern Virginia’s largest utility, has received aggregate requests for 50 gigawatts of power from data center projects. That’s more power than Iceland consumes in a year.

But many of these claims are either speculative or completely false. Developers look at potential sites and stake their claims for energy capacity long before they have capital or any strategy for how to break ground. In fact, estimates show that up to 90% of these requests are completely bogus.

In the early days of the data center boom, utilities never had to worry about spurious demand. Companies like Amazon, Google and Microsoft—called “hyperscalers” because they run data centers with hundreds of thousands of servers—made clear requests for power, and utilities simply delivered. But now the frenzy to secure power capacity has brought an influx of requests from lesser-known developers and speculators with questionable track records. Power companies that have traditionally dealt with only a handful of power-hungry customers are suddenly fielding requests for capacity that would overwhelm their entire grids.

Utilities struggle to separate fact from fiction

The challenge facing utilities isn’t just technical—it’s existential. They are tasked with determining what is real and what is not, and they are not well equipped to do it. Historically, utilities have been slow-growing, risk-averse institutions. Now they are being asked to vet speculators, many of whom are simply playing a real estate game, hoping to flip their power allocations once the market heats up.

Utilities have economic development teams, but those teams aren’t used to fielding dozens of speculative requests at once. It’s akin to a land rush where only a small fraction of claimants actually plan to build anything tangible. The result? Paralysis. Utilities are hesitant to allocate power when they don’t know which projects will materialize, delaying the whole development cycle.

A wall of capital

There’s no shortage of capital flowing into the data center space, but that abundance is part of the problem. When capital is readily available, it breeds speculation: too many players chasing the same market. This influx of speculators creates indecision not only in utilities but also in local communities, which must decide whether to grant permits for land use and infrastructure development.

Adding to the complexity is that data centers aren’t just for AI. Artificial intelligence is certainly driving a spike in demand, but there is also an ongoing need for cloud computing. Developers are building data centers to accommodate both, but distinguishing between the two is increasingly difficult, especially when projects blend AI hype with traditional cloud infrastructure.

What is real?

Legitimate players—the aforementioned Amazon, Google and Microsoft—are building real data centers, and many are adopting strategies like behind-the-meter deals with renewable energy providers or building microgrids to avoid the hurdles of grid interconnection. But as the real projects proliferate, so do the fake ones. Developers with little experience in the space are scrambling to stake their claims, creating an increasingly chaotic environment for utilities.

The problem isn’t just the financial risk—although the capital required to build a single gigawatt-scale campus can easily exceed several billion dollars—it’s the sheer complexity of developing infrastructure at this scale. A 6-gigawatt campus sounds impressive, but financial and engineering realities make it nearly impossible to build in a reasonable timeframe. Yet speculators are throwing around these huge numbers, hoping to lock in energy capacity they can flip later.

Why the grid can’t keep up with data center demand

As utilities struggle to separate fact from fiction, the grid itself becomes an obstacle. McKinsey recently estimated that global demand for data centers could reach up to 152 gigawatts by 2030, adding 250 terawatt-hours of new electricity demand. In the US, data centers alone could account for 8% of total electricity demand by 2030—a staggering figure given how little demand has grown over the past two decades.

Yet the grid is not ready for this influx. Interconnection and transmission problems are widespread, with projections suggesting the US could run out of capacity between 2027 and 2029 if alternative solutions are not found. Developers are increasingly turning to on-site generation, such as gas turbines or microgrids, to avoid the hurdles of interconnection, but these workarounds only highlight the grid’s limitations.

Conclusion: utilities as gatekeepers

The real obstacle isn’t a lack of capital (trust me, there’s plenty of capital here) or even technology—it’s the utilities’ ability to act as gatekeepers, determining who’s real and who’s just playing the speculation game. Without a robust process for vetting developers, the grid risks being overwhelmed by projects that will never materialize. The era of phantom data centers is here, and until utilities adapt, the entire industry may struggle to keep up with real demand.

In this chaotic environment, it’s not just about allocating power; it’s about utilities learning to navigate a new, speculative frontier so that businesses (and AI) can thrive.

Sophie Bakalar is a partner at Collaborative Fund.

DataDecisionMakers

Welcome to the VentureBeat community!

DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation.

If you want to read about cutting-edge ideas and up-to-date information, best practices and the future of data and data technology, join us at DataDecisionMakers.

You might even consider contributing an article of your own!

Read more from DataDecisionMakers


 