The rapid growth of AI is driving a significant increase in global energy demand. The technology simply requires far more power than traditional computation. As companies shift from CPUs to energy-intensive GPUs for AI development, the need for massive upgrades to data center infrastructure, including energy management and cooling, is growing. This poses challenges for meeting climate goals; there are no clear solutions yet on how to sustainably power the AI revolution.
AI Leadership – Fiona Passantino, 30 JAN 2025
Inference needs power
With the rapid, global rise of AI models for both consumers and enterprise systems comes a sudden, insatiable demand for energy. AI algorithms gobble energy to train, to run inference and to generate output. The people and time needed to train and refine models, the data centers and the sheer compute dwarf the needs of our previous technologies by a large margin.
It isn’t all just about AI information processing; this technology relies on thousands of specialized computer chips housed in upgraded data centers that are cool, stable and able to weather surges. The hardware behind this technology consumes more energy and generates more heat than traditional processes, and that heat, in turn, needs additional cooling, which means even more energy.

CPU vs GPU
Much of this is due to a different kind of processing chip: the GPU (Graphics Processing Unit) rather than the CPU (Central Processing Unit). GPUs are about four times more power dense, require more energy and give off more heat[ii]. They were originally developed to run high-end, graphics-rich video gaming environments, and because of their ability to run fast computations in parallel, they became the chip of choice for AI developers.
A global switch from CPUs to GPUs would be the first order of business for every company training frontier models. This requires a massive structural upgrade to the physical locations: an upgrade in energy management, efficiency, security and cooling. More energy.
The data centers we use now, enormous warehouses located in deserts or on the exurbs of large cities around the globe, provide only about 25% of the capacity projected for the kind of AI inference we expect to use in the future. Even the big AI development houses – Amazon, Microsoft and Google – are still running AI on CPU-based data centers[iii].
Google’s emissions for 2024 were 50% higher than they were in 2019, a significant setback in its goal to achieve net-zero emissions by 2030[iv]. Microsoft’s emissions also jumped 29% from 2020, while Meta’s emissions jumped 66% from 2021[v].
By 2027, AI inference could be responsible for 85 to 134 terawatt-hours (TWh) annually. That is about 0.5% of total global electricity use, roughly what a country such as Argentina, Sweden or the Netherlands uses in a year[vi].
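The 0.5% figure is easy to sanity-check. A minimal sketch, assuming annual global electricity consumption of roughly 27,000 TWh (that baseline is an assumption for illustration, not a figure from the article):

```python
# Back-of-envelope check: AI inference's share of global electricity.
# The 85-134 TWh range is quoted above; the ~27,000 TWh global
# consumption baseline is an assumption for scale.
AI_INFERENCE_TWH_LOW = 85
AI_INFERENCE_TWH_HIGH = 134
GLOBAL_ELECTRICITY_TWH = 27_000  # assumed annual global consumption

low_share = AI_INFERENCE_TWH_LOW / GLOBAL_ELECTRICITY_TWH * 100
high_share = AI_INFERENCE_TWH_HIGH / GLOBAL_ELECTRICITY_TWH * 100
print(f"AI inference share: {low_share:.2f}%-{high_share:.2f}%")
```

The high end of the range works out to just under 0.5%, consistent with the claim in the text.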

AI Hardware Demands
This is where Nvidia comes in, now one of the world’s most powerful AI supplier companies. Nvidia started as a gaming chip developer and quickly pivoted to selling its high-capacity GPUs to AI developers. Its 2023 H100 AI chip required a constant power stream of roughly 400 W per chip, while its 2024 successor, the H200, gobbles up nearly twice that, 700 W, which is what a microwave uses to operate[vii]. If a single data megacenter with one million servers replaced all of its CPU-based servers with GPU-based ones, the power needed would increase four- to five-fold (1,500 MW), the equivalent of a nuclear power station[viii].
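The megacenter figure follows from simple division. A quick sketch of the arithmetic, where the 1,500 MW total and the 4–5x multiplier come from the article and the per-server CPU baseline is derived from them rather than measured:

```python
# Sketch of the megacenter math quoted above: 1,500 MW across one
# million GPU-based servers, a 4-5x jump over the CPU baseline.
SERVERS = 1_000_000
GPU_TOTAL_MW = 1_500  # quoted total draw after the GPU switch

# Implied per-server draw after the switch: 1,500 W per server.
gpu_watts_per_server = GPU_TOTAL_MW * 1e6 / SERVERS

# Implied CPU-era baseline at the midpoint (4.5x) of the quoted range.
cpu_watts_per_server = gpu_watts_per_server / 4.5
```

At the quoted multiplier, the implied CPU-era baseline lands in the 300–400 W per-server range, which is what makes the 4–5x jump plausible.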
Nvidia’s Blackwell cluster ships with eight GPUs that together pull 15,000 W of power, double the demand of its previous release[ix]. CEO Jensen Huang has said he expects to see millions of these AI processors shipping to data centers around the world. One million Blackwell GPUs would use an astonishing 1.875 gigawatts of power, about two nuclear power plants’ worth of energy[x]. These systems, again, need more powerful cooling to house them: at the current pace, data centers would need totally new designs on an annual basis.
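The 1.875 GW figure checks out from the cluster spec. A minimal sketch, assuming roughly 1 GW of output per nuclear plant (that plant capacity is an assumption for scale, not from the article):

```python
# Checking the Blackwell numbers: 15,000 W per eight-GPU cluster,
# scaled to one million GPUs. The ~1 GW per nuclear plant figure
# is an assumption for comparison.
CLUSTER_WATTS = 15_000
GPUS_PER_CLUSTER = 8
GPU_COUNT = 1_000_000
PLANT_GW = 1.0  # assumed output of one nuclear plant

watts_per_gpu = CLUSTER_WATTS / GPUS_PER_CLUSTER  # 1,875 W per GPU
total_gw = watts_per_gpu * GPU_COUNT / 1e9        # total in gigawatts
plants_needed = total_gw / PLANT_GW               # roughly two plants
```

Per-GPU draw is 1,875 W, so a million of them come to exactly 1.875 GW, or about two typical nuclear plants.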
Global electricity demand could grow 20% by 2030, with AI data centers in the US alone expected to add about 323 terawatt-hours of demand[xi]. With AI driving about 19% of all data center power demand, overall demand from these centers could grow 160% by 2030[xii].
Around 15% of the world’s data centers are in Europe. Power demand on the continent could grow by 40–50% by 2030, matching the current total consumption of Portugal, Greece and the Netherlands combined[xiii]. But with the world’s oldest power grid, Europe will need far more investment than other parts of the world to keep new data centers electrified: about €800 billion in transmission and distribution over the coming decade, and a further €850 billion to increase solar, wave, geothermal and wind energy[xiv].
How does this explosive need for energy square with the global climate crisis, and the pressing need to reduce, rather than increase, our energy use? None of the large players has a good answer. Google and Microsoft have both suggested that AI will soon become intelligent enough to solve these problems for us, helping us develop high-capacity batteries and global energy sharing, and discover new sources of clean energy our feeble Human brains aren’t, as yet, able to contemplate.
For now, however, we are keeping coal-fired power plants burning and nuclear power stations slated for closure open, in both Europe and the US, while AI energy demands spiral upwards. Microsoft and OpenAI are building a new $100 billion, five-gigawatt data center, which will require the equivalent of five nuclear reactors to energize.
The AI revolution will need a complete overhaul of data center infrastructure from the inside out to keep up with the global need for inference. Where the power will come from is still an open question.

Need help with AI Integration?
Reach out to me for advice – I have a few nice tricks up my sleeve to help guide you on your way, as well as a few “insiders’ links” I can share to get you that free trial version you need to get started.

No eyeballs to read or watch? Just listen.
Working Humans is a bi-monthly podcast focusing on the AI and Human connection at work. Available on Apple and Spotify.

About Fiona Passantino
Fiona helps empower working Humans with AI integration, leadership and communication, maximizing connection, engagement and creativity to bring more joy and inspiration into the workplace. A passionate keynote speaker, trainer, facilitator and coach, she is a prolific content producer, host of the podcast “Working Humans” and award-winning author of the “Comic Books for Executives” series. Her latest book is “The AI-Powered Professional”.
[i] Goldman Sachs (2024) “AI is poised to drive 160% increase in data center power demand” Goldman Sachs. Accessed August 21, 2024. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
[ii] Gelles (2024) “A.I.’s Insatiable Appetite for Energy” The New York Times. Accessed August 21, 2024. https://www.nytimes.com/2024/07/11/climate/artificial-intelligence-energy-usage.html?smid=url-share
[iii] Studer (2023) “The energy challenge of powering AI chips” Robeco. Accessed August 21, 2024. https://www.robeco.com/en-int/insights/2023/11/the-energy-challenge-of-powering-ai-chips
[iv] Bartlett (2024) “Google’s carbon emissions surge nearly 50% due to AI energy demand” CNBC. Accessed August 21, 2024. https://www.cnbc.com/2024/07/02/googles-carbon-emissions-surge-nearly-50percent-due-to-ai-energy-demand.html
[v] Gelles (2024) “A.I.’s Insatiable Appetite for Energy” The New York Times. Accessed August 21, 2024. https://www.nytimes.com/2024/07/11/climate/artificial-intelligence-energy-usage.html?smid=url-share
[vi] Erdenesanaa (2023) “A.I. Could Soon Need as Much Electricity as an Entire Country” New York Times. Accessed August 21, 2024. https://www.nytimes.com/2023/10/10/climate/ai-could-soon-need-as-much-electricity-as-an-entire-country.html
[vii] Studer (2023) “The energy challenge of powering AI chips” Robeco. Accessed August 21, 2024. https://www.robeco.com/en-int/insights/2023/11/the-energy-challenge-of-powering-ai-chips
[viii] Studer (2023) “The energy challenge of powering AI chips” Robeco. Accessed August 21, 2024. https://www.robeco.com/en-int/insights/2023/11/the-energy-challenge-of-powering-ai-chips
[ix] Loeffler (2024) “I watched Nvidia’s Computex 2024 keynote and it made my blood run cold” TechRadar. Accessed August 21, 2024. https://www.techradar.com/computing/i-watched-nvidias-computex-2024-keynote-and-it-made-my-blood-run-cold
[x] Loeffler (2024) “I watched Nvidia’s Computex 2024 keynote and it made my blood run cold” TechRadar. Accessed August 21, 2024. https://www.techradar.com/computing/i-watched-nvidias-computex-2024-keynote-and-it-made-my-blood-run-cold
[xi] Bartlett (2024) “Google’s carbon emissions surge nearly 50% due to AI energy demand” CNBC. Accessed August 21, 2024. https://www.cnbc.com/2024/07/02/googles-carbon-emissions-surge-nearly-50percent-due-to-ai-energy-demand.html
[xii] Goldman Sachs (2024) “AI is poised to drive 160% increase in data center power demand” Goldman Sachs. Accessed August 21, 2024. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
[xiii] Goldman Sachs (2024) “AI is poised to drive 160% increase in data center power demand” Goldman Sachs. Accessed August 21, 2024. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
[xiv] Goldman Sachs (2024) “AI is poised to drive 160% increase in data center power demand” Goldman Sachs. Accessed August 21, 2024. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand