The world is rushing to capitalize on the commercial and societal promise of AI. In the northeast of England, the recently announced Teesworks project promises to be Europe’s largest data center.
Across the Atlantic, Amazon’s sprawling facilities in Indiana underscore how enterprises and governments are sprinting to build infrastructure for the AI era.
The UK government’s new Compute Roadmap, for example, calls for at least 6GW of AI-ready data center capacity by 2030—triple the current national footprint—to keep pace with the US and other leading markets.
But beneath this breakneck growth, a quieter crisis is emerging. The computational demands of AI may be racing ahead, but the infrastructure required to support them—above all, power—is trailing behind. An urgent question must be answered: how can the grid keep up with our desire to scale AI?
Founder and CEO of Finchetto.
AI’s Boom Is Powering Up, But Can the Grid Keep Up?
Projects like Teesworks and Amazon’s Indiana buildout are part of a global rush to shore up data center capacity. Yet this rapid buildout is exposing a fundamental mismatch. Even as AI’s hunger for computing resources grows exponentially, there are high-profile harbingers of the potential bottlenecks introduced when national grids can’t keep up.
In Northern Virginia – the world’s densest cloud hub – new AI and cloud projects have had to be paused due to a lack of electricity. Over in Ireland, data centers now consume more than 20% of national electricity, prompting proposals that they build their own private power lines. The UK, meanwhile, is relaxing planning rules for new transmission towers to speed up grid upgrades.
This isn’t a problem inherent to regional infrastructures – it’s a global phenomenon brought about by putting the AI cart before the energy horse. And with AI’s runaway growth unlikely to slow down anytime soon, the focus must be as much on finding ways to reduce energy demand as on expanding grid capacity.
The Looming Power Surge
The data backs up the anecdotes. According to a Deloitte survey, 72% of US energy and data center executives see securing power capacity as an extreme challenge amid widespread AI adoption, and 82% see innovation – not just grid expansion – as the only viable solution.
Bloomberg Intelligence reports that there’s now a 12–24 month gap between when data centers need power and when the grid can deliver it, a delay that is stalling growth in key markets.
The issue is both technical and systemic. Even when renewable energy is available – such as wind power from Scotland – it often cannot reach the data centers that need it most, because transmission infrastructure is constrained.
We face a kind of energy Catch-22: the need for more energy is desperate, but the energy we generate cannot always be used where it is required.
The problem is compounded by the fact that conventional data center hardware is not designed to be energy-efficient at the scale now demanded by AI workloads.
Scaling Responsibly Versus Collapsing Under Demand
The solution is not simply to build more data centers and expand the grid accordingly, but also to rethink the fundamentals of computing infrastructure.
The investment gap runs deeper than data centers alone: we need more of them, yes, but also better grid access, accelerated renewable integration, and – critically – a new generation of energy-efficient hardware within the data centers themselves.
Moore’s Law, which drove decades of exponential growth in computing, is reaching its limits. AI demands something more radical.
The industry must look to technologies such as analogue computing, neuromorphic chips, and especially light-based (all-optical) architectures that eschew the costly energy conversions of current electro-optical networks.
These innovations promise not just marginal gains, but step changes in energy efficiency—delivering the performance needed for AI workloads while slashing the electricity required per calculation.
Rethinking Growth: Why More AI Shouldn’t Mean More Megawatts
At the moment, we measure AI progress in benchmarks, parameters, and FLOPS. But these are incomplete metrics if they ignore the energy cost of each inference. The industry must now prioritize “watts per task” as much as it does “exaflops”.
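To make the idea concrete: an efficiency-first metric divides a machine’s power draw by its useful throughput, giving energy per task rather than raw speed. The sketch below uses entirely hypothetical figures (the accelerator names, power draws, and throughputs are illustrative assumptions, not measurements of any real hardware):

```python
# Sketch: ranking accelerators by energy per inference instead of raw
# throughput. All numbers are illustrative assumptions, not benchmarks.

def joules_per_task(avg_power_watts: float, tasks_per_second: float) -> float:
    """Energy per task: power (joules/second) divided by throughput (tasks/second)."""
    return avg_power_watts / tasks_per_second

# Hypothetical accelerators: (average power draw in watts, inferences per second)
accelerators = {
    "gpu_baseline": (700.0, 2000.0),
    "efficient_asic": (150.0, 1200.0),
}

for name, (power, throughput) in accelerators.items():
    print(f"{name}: {joules_per_task(power, throughput):.3f} J per inference")
```

Under these made-up figures, the slower chip wins on energy per inference (0.125 J vs 0.35 J) – the kind of trade-off a “watts per task” mindset surfaces and a pure throughput benchmark hides.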
This is not just about engineering, but about ethics: as AI becomes central to fields from healthcare to climate science, unchecked growth in energy demand risks both the planet and the public’s trust in AI’s benefits.
Solving the energy challenge is not optional—it is existential for the AI industry. The International Energy Agency (IEA) warns that electricity demand from data centers worldwide is set to more than double by 2030, with AI at the heart of the surge.
Without a shift towards smarter, more efficient infrastructure, we risk both environmental harm and the slowing of AI’s transformative potential.
A Smarter Future Demands Smarter Foundations
Every week, new forecasts predict exponential AI growth and the accompanying environmental strain. The answer is not to slow progress, but to accelerate investment in the technologies that can break the link between AI expansion and energy consumption.
This is a call to action for the entire industry—for tech leaders, policymakers, and researchers to collaborate on global standards for efficiency, to back breakthrough research in energy-efficient hardware, and to ensure that the infrastructure of the future is designed for the demands of the present.
The question is no longer if AI will change the world, but whether we have the world to sustain AI’s rise.
The time to invest in smarter, not just bigger, foundations is now.
This article was produced as part of TechRadarPro’s Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro