Part one of our Keep Calm and Count the Kilowatts series showed how AI prompts are only a small portion of a person’s daily power use. Part two explored how AI’s power, water and carbon footprints stack up on a global scale.
But the real environmental impact here isn’t the tiny sip of power your individual prompt uses; it’s the massive, concentrated impact new data centers can have on the specific towns and ecosystems they’re built in.
Disproportionate effects
AI-specific data centers leave a bigger, messier footprint than other types of data centers: they are large industrial facilities that strain power grids, water supplies and air quality.
There are two main problems. The first is power density: running AI often means cramming loads of high-wattage GPUs into a small area, so the resulting data center can use a lot more power than one of the same size that’s just streaming Netflix.
Operators are already discovering that this kind of concentrated demand forces them to rewire substations and delay new builds, because the local grid can’t always supply enough power for AI-driven expansions to fully operate.
In fact, companies are having to reshape power delivery and cooling around AI workloads, and AI is leading to higher data center emissions.
The second problem is location: a data center that focuses on training AI models rather than serving content to users can be built where other data centers can’t, and those areas are often less equipped to handle the impact.
Think of the air pollution from portable gas turbines when an AI data center was built in an area where the grid could only supply 4% of its power needs. Sites like this can end up importing diesel, burning gas on-site and competing with local residents for already-stretched infrastructure.
And while data centers are not large users of water compared to other industries, they are often built in areas where even modest consumption has a major impact on the limited resources available.
It’s easy to blame AI alone for these impacts, but the underlying problem is the lax (and many would say corrupt) regulations and laws (not to mention the politicians in charge) that make it cheaper for companies to harm the environment than work toward sustainability.
In fact, AI (and data centers in general) don’t have to use any water for cooling and can be carbon neutral – it just costs more, reducing profits.
Can data centers go green?
Right now it’s a race to build out new GPU farms wherever possible, which is predicted to triple local energy demand by 2035. Avoiding negative impacts from this growth isn’t an unknown, or even especially hard – it’s well-studied engineering.
The key ingredients are better grid planning so power-hungry data centers don’t outgrow their local supply; options like dense but efficient water-cooled racks that waste less energy as heat; and regulations and incentives that make it more profitable for companies to use renewable energy and rely less on local resources like water.
While not yet enough, this greener approach is already happening. Google has what feels like an on-again, off-again relationship with its ‘don’t be evil’ mantra, but the company sources about 66% of its electricity from renewable sources and tops that up to 100% with offsets. Google is also experimenting with campuses that sit right next to wind and solar farms.
But right now, these greener approaches are (mostly) not being done out of the goodness of a company’s heart – it’s because if the grid can’t keep up with AI’s surging energy demands, future profit might be lost.
And just because a company tries something doesn’t mean it will stick with it – Microsoft canned its Project Natick underwater data center tests despite them being a success.
The missing step is still government regulations and incentives. Done properly, it’s entirely reasonable to balance data center growth with environmental responsibility and avoid any negative impacts to the local area.
And despite political opposition, renewable production continues to ramp up delightfully fast and is expected to be more than enough to meet new demand (including from AI) over the rest of the decade.
Data centers can also help out the local area: waste heat can be a valuable community asset for heating homes and even greenhouses.
What’s next?
A sustainable AI future is also about using the tech to shrink emissions faster than it grows them. That could include doing energy intensive model training in areas where green energy is plentiful, and then putting AI to work in ways that can help amplify and improve existing efforts to reduce environmental impact.
It also means having more conversations about the real impacts of data centers: right now AI companies rarely talk about their energy usage in detail even as data centers quietly become a much bigger slice of global emissions and big players like Google use more power every year – and not just for AI.
It’s not as simple as spending more money on higher tech solutions, and balancing cost and climate impact reduction is an important and nuanced consideration in the AI era of data infrastructure.
Still, AI data centers can be built in areas and ways that support the local community – but only when there are also suitable regulations and infrastructure upgrades.
And yes, AI can have a lot of problematic impacts as a technology, but its sudden growth has mostly exposed flaws that already existed in our regulations and energy systems. Acknowledging and discussing these underlying problems means we can better focus on creating a genuinely sustainable AI future.
The takeaway
The core fact is that one AI prompt (or even hundreds) is only a tiny fraction of most people’s daily power use, and small compared to luxuries like TV, gaming and even your Christmas lights.
On the global scale, AI power use is significant enough to pay attention to, but it’s still only a minor part of the collective race to see whether technology will save us from ourselves, or just provide a more entertaining apocalypse.
Of course, you don’t have to just sit and wait to see how it all plays out. Taking matters into your own hands and offsetting the CO2 emissions from your AI use is a rounding error on the already surprisingly low cost of going carbon neutral.
In fact, offsetting all my personal carbon emissions for a year starts from about the same cost as a ChatGPT Plus subscription.
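For the curious, here’s the back-of-the-envelope math behind that comparison as a quick Python sketch. Every number in it is an illustrative assumption rather than sourced data: a roughly US-average personal footprint, low-end voluntary offset pricing, and ChatGPT Plus at $20 per month.

```python
# Yearly cost of offsetting one person's emissions vs a ChatGPT Plus subscription.
# All figures below are illustrative assumptions, not sourced data.
personal_footprint_tonnes = 15    # assumed: ~US-average personal CO2e per year
offset_price_per_tonne = 16       # assumed: USD, low end of voluntary offset pricing
chatgpt_plus_monthly = 20         # assumed: USD per month

offset_cost_per_year = personal_footprint_tonnes * offset_price_per_tonne
subscription_per_year = chatgpt_plus_monthly * 12

print(f"Offsets: ${offset_cost_per_year}/year")        # → Offsets: $240/year
print(f"ChatGPT Plus: ${subscription_per_year}/year")  # → ChatGPT Plus: $240/year
```

With these (hypothetical) numbers, the two come out at roughly the same annual cost, which is the point of the comparison.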
So keep calm, count the kilowatts, and focus on where the big gains lie – just remembering to turn off the bathroom light before bed buys you 250 guilt-free AI prompts.
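If you want to sanity-check that last figure yourself, here’s the rough arithmetic as a Python sketch. The bulb wattage, hours and per-prompt energy are all assumptions: a ~10 W LED left on overnight, and ~0.3 Wh per prompt as a commonly cited ballpark.

```python
# Back-of-the-envelope check on the "250 guilt-free prompts" line.
# All inputs are illustrative assumptions, not measurements.
bulb_watts = 10        # assumed: small LED bathroom bulb
hours_on = 7.5         # assumed: left on overnight
wh_per_prompt = 0.3    # assumed: commonly cited per-prompt estimate

saved_wh = bulb_watts * hours_on    # energy saved by flipping the switch
prompts = saved_wh / wh_per_prompt

print(f"Turning off the light saves ~{saved_wh:.0f} Wh, "
      f"or about {prompts:.0f} prompts")
# → Turning off the light saves ~75 Wh, or about 250 prompts
```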
Don’t get me wrong, AI is mired in issues and controlled by problematic people and companies, but the doom and gloom isn’t because of the electricity use. Mostly.
Not convinced that AI can go green? Let me know what you think is a better plan in the comments!
How we use AI
Here at TechRadar, our coverage is author-driven. AI helps with searching sources, research, fact-checking, plus spelling and grammar suggestions. A human still checks every figure, source and word before anything goes live. Occasionally we use it for important work like adding dinosaurs to colleagues’ photos. For the full rundown, see our Future and AI page.