Phantom data centers: What they are (or aren’t) and why they’re hindering the true promise of AI
In the age of AI, municipal utilities are faced with a new, unexpected problem: phantom data centers. On the surface, it may seem absurd: why (and how) would anyone fake something as complex as a data center? But as AI demand skyrockets along with the need for more computing power, speculation around data center development is causing chaos, particularly in areas like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are being bombarded with power requests from real estate developers, regardless of whether those developers can actually build the infrastructure they claim.
Phantom data centers represent a pressing bottleneck in scaling data infrastructure to keep pace with computing demand. This emerging phenomenon prevents capital from flowing where it is actually needed. Any company that can help solve this problem (perhaps by using AI to solve a problem created by AI) will have a significant advantage.
The mirage of gigawatt requirements
Dominion Energy, Northern Virginia’s largest utility, has received aggregate requests for 50 gigawatts of power from data center projects. That’s more electricity than Iceland uses in a year.
But many of these inquiries are either speculative or outright false. Developers have been eyeing potential sites and staking their claims on power capacity long before they have the capital or a strategy to break ground. In fact, estimates suggest that up to 90% of these requests are completely fake.
In the early days of the data center boom, utilities didn’t have to worry about fake demand. Companies like Amazon, Google and Microsoft (also called “hyperscalers” because they operate data centers with hundreds of thousands of servers) made straightforward power requests, and utilities supplied them. But now the rush to secure power capacity has led to an influx of inquiries from lesser-known developers and speculators with dubious track records. Utilities that traditionally serve only a handful of power-hungry customers are suddenly inundated with requests for capacity that would dwarf their entire grids.
Utilities have difficulty separating fact from fiction
The challenge for utilities is not only technical but existential. Their job is to determine what is real and what is not, and they are not well equipped for it. Historically, utilities have been slow, risk-averse institutions. Now they are being asked to vet speculators, many of whom are simply playing a real estate game, hoping to flip their power allocations once the market heats up.
Utilities have groups dedicated to economic development, but those teams aren’t used to fielding dozens of speculative requests at once. It’s akin to a land rush in which only a fraction of those staking claims actually plan to build anything tangible. The result? Paralysis. Utilities are reluctant to allocate power when they don’t know which projects will actually be built, slowing down the overall development cycle.
A wall of capital
There is no shortage of capital flowing into the data center space, but this abundance is part of the problem. When capital is easily accessible, it leads to speculation. In some ways, this is similar to the better mousetrap problem: too many players chasing an oversupplied market. This influx of speculators is causing indecision not only among utility companies, but also among local communities that must decide whether to grant permits for land use and infrastructure development.
To make matters worse, data centers aren’t just for AI. Sure, AI is driving demand, but there is also a continued need for cloud computing. Developers are building data centers to enable both, but distinguishing between the two is becoming increasingly difficult, especially as projects merge AI hype with traditional cloud infrastructure.
What is real?
The legitimate players – the Amazons, Googles and Microsofts mentioned above – are building real data centers, and many are pursuing strategies such as behind-the-meter contracts with renewable energy providers or building microgrids to avoid grid connection bottlenecks. But as real projects proliferate, so do fake ones. Developers with little experience in the field are trying to cash in, creating an increasingly chaotic environment for utilities.
The problem is not just the financial risk – the capital required to build a single one-gigawatt campus can easily run into the billions of dollars – but also the sheer complexity of developing infrastructure at this scale. A 6-gigawatt campus sounds impressive, but financial and technical realities make it almost impossible to build in a reasonable time frame. Still, speculators throw out these huge numbers in hopes of securing power capacity and flipping the project later.
Why the grid can’t keep up with data center demand
As utilities struggle to separate fact from fiction, the power grid itself is becoming a bottleneck. McKinsey recently estimated that global data center demand could reach 152 gigawatts by 2030, which translates into an additional electricity requirement of roughly 250 terawatt-hours. In the US, data centers alone could account for 8% of total electricity demand by 2030, an astonishing figure considering how little overall demand has grown over the past two decades.
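For a rough sense of what these units mean, the sketch below (a back-of-the-envelope calculation in Python; the 60% average utilization is an illustrative assumption, not a figure from McKinsey or this article) shows how gigawatts of data center capacity translate into terawatt-hours of annual electricity consumption:

# Back-of-the-envelope conversion from capacity (GW) to annual energy (TWh).
# The 60% average utilization is an assumption for illustration only.
HOURS_PER_YEAR = 8_760

def annual_energy_twh(capacity_gw: float, utilization: float = 0.6) -> float:
    """Annual electricity consumption in TWh for a given capacity in GW."""
    return capacity_gw * utilization * HOURS_PER_YEAR / 1_000

# A single 1 GW campus at the assumed 60% average utilization:
print(f"{annual_energy_twh(1.0):.1f} TWh/year")  # ~5.3 TWh/year

By this rough math, a fully utilized gigawatt of capacity consumes on the order of 8.8 terawatt-hours per year, which is why requests measured in tens of gigawatts rival the annual consumption of small countries.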
However, the power grid is not ready for this influx. Interconnection and transmission problems are common. Estimates suggest that US power capacity could be exhausted between 2027 and 2029 if alternative solutions are not found. Developers are increasingly turning to on-site generation, such as gas turbines or microgrids, to avoid interconnection bottlenecks, but these stopgap measures only underscore the limits of the grid.
Conclusion: utilities as gatekeepers
The real bottleneck isn’t a lack of capital (believe me, there’s plenty of capital here) or even technology – it’s the ability of utilities to act as gatekeepers, determining who is real and who is just playing the speculation game. Without a robust process for vetting developers, there is a risk that the grid will be flooded with projects that never materialize. The age of phantom data centers is here, and until utilities adapt, the entire industry could struggle to keep up with real demand.
In this chaotic environment, it’s not just about allocating power; it’s about utilities learning to navigate a new, speculative frontier so that businesses (and AI) can thrive.
Sophie Bakalar is a partner at Community Fund.