Green Junction
Over the past few years, the “AI movement” has become an integral part of our lives. AI pops up in internet searches, online meetings, and numerous other online contexts, often without our permission. The enormous amounts of energy and resources that data centers require for AI have prompted protests by residents in areas where data centers are proposed or approved.
MIT Technology Review recently published an article entitled “We did the math on AI’s energy footprint. Here’s the story you haven’t heard.” It begins with the statement “The emissions from individual AI text, image, and video queries seem small—until you add up what the industry isn’t tracking and consider where it’s heading next.” The authors state that AI is being built into all types of queries without our consent, “from search, to agents, to the mundane daily apps we use to track our fitness, shop online, or book a flight. The energy resources required to power this artificial-intelligence revolution are staggering, and the world’s biggest tech companies have made it a top priority to harness ever more of that energy, aiming to reshape our energy grids in the process.”
The energy demands “are staggering” at a time when energy conservation and clean energy initiatives are critical for addressing climate change. “Meta and Microsoft are working to fire up new nuclear power plants. OpenAI and Donald Trump announced the Stargate initiative, which aims to spend $500 billion—more than the Apollo space program—to build as many as 10 data centers (each of which could require five gigawatts, more than the total power demand from the state of New Hampshire). Apple announced plans to spend $500 billion on manufacturing and data centers in the US over the next four years. Google expects to spend $75 billion on AI infrastructure alone in 2025.”
AI is increasing the carbon footprint of every person who uses the internet. “Data centers are expected to continue trending toward using dirtier, more carbon-intensive forms of energy (like gas) to fill immediate needs, leaving clouds of emissions in their wake. And all of this growth is for a new technology that’s still finding its footing, and in many applications—education, medical advice, legal analysis—might be the wrong tool for the job or at least have a less energy-intensive alternative.” “Generative AI tools are getting practically shoved down our throats and it’s getting harder and harder to opt out, or to make informed choices when it comes to energy and climate.” These are only some of the concerns associated with an AI revolution that currently lacks regulations and guardrails. Share these concerns with your elected officials. https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
If you are interested, here are more excerpts from the article:
“Gaps in power supply, combined with the rush to build data centers to power AI, often mean shortsighted energy plans. In April, Elon Musk’s X supercomputing center near Memphis was found, via satellite imagery, to be using dozens of methane gas generators that the Southern Environmental Law Center alleges are not approved by energy regulators to supplement grid power and are violating the Clean Air Act.”
“In December, OpenAI said that ChatGPT receives 1 billion messages every day, and after the company launched a new image generator in March, it said that people were using it to generate 78 million images per day, from Studio Ghibli–style portraits to pictures of themselves as Barbie dolls.”
“[L]eading labs are racing us toward a world where AI ‘agents’ perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called ‘reasoning models’ that work through tasks logically but have been found to require 43 times more energy for simple problems, or ‘deep research’ models that spend hours creating reports for us. We will have AI models that are ‘personalized’ by training on our data and preferences.”
“AI models are being added to everything from customer service phone lines to doctor’s offices, rapidly increasing AI’s share of national energy consumption.”
“The precious few numbers that we have may shed a tiny sliver of light on where we stand right now, but all bets are off in the coming years,” says Luccioni. “Generative AI tools are getting practically shoved down our throats and it’s getting harder and harder to opt out, or to make informed choices when it comes to energy and climate.”
“A report published in December by the Lawrence Berkeley National Laboratory, which is funded by the Department of Energy and has produced 16 Nobel Prizes, attempted to measure what AI’s proliferation might mean for energy demand.
By analyzing both public and proprietary data on data centers as a whole, as well as the specific needs of AI, the researchers reached a clear conclusion. Data centers in the US used approximately 200 terawatt-hours of electricity in 2024, roughly equivalent to the amount needed to power Thailand for a year. AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million US homes for a year.
If we imagine that the bulk of it was used for inference, it means that enough electricity was used on AI in the US last year for every person on Earth to have exchanged more than 4,000 messages with chatbots. In reality, of course, average individual users aren’t responsible for all this power demand. Much of it is likely going toward startups and tech giants testing their models, power users exploring every new feature, and energy-heavy tasks like generating videos or avatars.”
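As a rough sanity check, the article’s two comparisons can be reproduced from its high-end 76 terawatt-hour estimate. The per-message energy (about 2 watt-hours), the average US household consumption (about 10,500 kWh per year), and the world population figure below are illustrative assumptions for this sketch, not numbers taken from the article:

```python
# Back-of-envelope check of the article's comparisons.
# Assumptions (illustrative, not from the article):
#   - average US home uses ~10,500 kWh of electricity per year
#   - a chatbot message costs ~2 Wh of electricity
#   - world population is roughly 8.1 billion

AI_ENERGY_WH = 76e12        # 76 TWh (article's high-end US AI-server estimate), in watt-hours
HOME_KWH_PER_YEAR = 10_500  # assumed average annual US household use
WH_PER_MESSAGE = 2.0        # assumed energy per chatbot message
WORLD_POP = 8.1e9           # rough world population

# How many US homes could 76 TWh power for a year?
homes_powered = AI_ENERGY_WH / (HOME_KWH_PER_YEAR * 1_000)

# How many chatbot messages per person on Earth does 76 TWh represent?
messages_per_person = AI_ENERGY_WH / WH_PER_MESSAGE / WORLD_POP

print(f"Homes powered for a year: {homes_powered / 1e6:.1f} million")
print(f"Messages per person on Earth: {messages_per_person:,.0f}")
```

Under these assumptions the arithmetic lands close to the article’s figures: just over 7 million homes, and more than 4,000 messages for every person on Earth.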
“Individuals may end up footing some of the bill for this AI revolution, according to new research published in March. The researchers, from Harvard’s Electricity Law Initiative, analyzed agreements between utility companies and tech giants like Meta that govern how much those companies will pay for power in massive new data centers. They found that discounts utility companies give to Big Tech can raise the electricity rates paid by consumers. In some cases, if certain data centers fail to attract the promised AI business or need less power than expected, ratepayers could still be on the hook for subsidizing them. A 2024 report from the Virginia legislature estimated that average residential ratepayers in the state could pay an additional $37.50 every month in data center energy costs.”
“It’s not clear to us that the benefits of these data centers outweigh these costs,” says Eliza Martin, a legal fellow at the Environmental and Energy Law Program at Harvard and a coauthor of the research. “Why should we be paying for this infrastructure? Why should we be paying for their power bills?”
Julie Peller, Ph.D., is an environmental chemist and Professor of Chemistry at Valparaiso University. She has been writing the weekly column The Green Junction for the past seven years and is helping to move the call of Laudato Si’ into action. Her research interests are advanced oxidation for aqueous solutions, water quality analyses, emerging contaminants, air quality analyses, Lake Michigan shoreline challenges (Cladophora, water, and sediment contaminants), and student and citizen participation in environmental work.
Discover more from Innovate ~ Educate ~ Collaborate