Estimating the Energy and Environmental Impact of AI

By Mark Miller

Have you ever wondered how much energy is used—and what the carbon consequences might be—when you ask AI for text, an image, or a video? James O'Donnell and Casey Crownhart at MIT Technology Review did. What they learned is in their report, “We did the math on AI’s energy footprint. Here’s the story you haven’t heard.”
Data Center Driver
Before you can ask AI anything, its models must first be trained in a data center. That training is driving demand for more data centers, and because AI relies on energy-intensive hardware, powering those data centers will require even more energy.
By 2028, based on projections in the 2024 United States Data Center Energy Usage Report from Lawrence Berkeley National Laboratory, more than half of the electricity powering data centers will be used for AI. And this energy may not be very carbon friendly. The MIT Technology Review story reports that “the carbon intensity of electricity used by data centers was 48 percent higher than the U.S. average.” Carbon intensity measures “how many grams of carbon dioxide emissions are produced for each kilowatt-hour of electricity consumed.”
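The carbon-intensity arithmetic is simple enough to sketch. In the snippet below, the 400 g CO2/kWh baseline is an illustrative round number, not the actual U.S. average; only the 48 percent markup comes from the report.

```python
# Illustrative carbon-intensity arithmetic.
# NOTE: 400 g CO2/kWh is a made-up round number for this example,
# not the real U.S. grid average.
ILLUSTRATIVE_US_AVG_G_PER_KWH = 400.0

# Per the report, data-center electricity was 48 percent more
# carbon-intensive than the U.S. average.
data_center_intensity = ILLUSTRATIVE_US_AVG_G_PER_KWH * 1.48

def emissions_grams(kwh: float, intensity_g_per_kwh: float) -> float:
    """Grams of CO2 emitted for a given amount of electricity consumed."""
    return kwh * intensity_g_per_kwh

print(f"Data-center intensity: {data_center_intensity:.0f} g CO2/kWh")
print(f"Emissions for 10 kWh: {emissions_grams(10, data_center_intensity):.0f} g CO2")
```

With the illustrative baseline, data-center electricity would come out to roughly 592 g of CO2 per kilowatt-hour rather than 400.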
Inferences and Output
Once AI models are trained, you can ask them to do things by making queries, called “inferences.” The MIT Technology Review reporters arrived at the following potential energy-usage figures for queries to open-source text, image, and video models.
- Text: 6,706 joules, or the energy to power a microwave for 8 seconds
- Image: 4,402 joules, or the energy to power a microwave for 5.5 seconds
- Video: 3.4 million joules, or the energy to power a microwave for 1 hour
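The microwave comparison is just a unit conversion: a watt is one joule per second, so dividing a query's joules by the microwave's wattage gives runtime in seconds. The 900 W figure below is an assumption for illustration; the story does not state the wattage it used, so the results only roughly match its 8-second, 5.5-second, and 1-hour equivalents.

```python
# Convert query energy (joules) into equivalent microwave runtime.
# ASSUMPTION: a 900 W microwave; the report does not state the wattage
# behind its comparisons, so these figures are approximate.
MICROWAVE_WATTS = 900.0

def microwave_seconds(joules: float, watts: float = MICROWAVE_WATTS) -> float:
    """Seconds a microwave of the given wattage runs on this much energy."""
    return joules / watts  # watts = joules per second

# Per-query energy figures from the story.
queries = {"text": 6_706, "image": 4_402, "video": 3_400_000}
for name, joules in queries.items():
    print(f"{name}: {microwave_seconds(joules):,.1f} seconds")
```

At 900 W the video query works out to about 63 minutes of microwave time, close to the story's one-hour figure; a lower assumed wattage would stretch all three equivalences.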
Those figures might not sound like much, but inference may account for 80 to 90 percent of AI’s computing power. If all the electricity consumed by AI-specific servers in the U.S. in 2024 had gone to inference, it would have been enough for every person on Earth to exchange 4,000 messages with chatbots, although, as the MIT Technology Review story qualifies, much of that energy likely went to purposes other than inference.
Conclusions and Questions
The MIT Technology Review investigation relies on research from Lawrence Berkeley National Laboratory to estimate that, by 2028, the energy required by AI-specific tasks will exceed what it takes to “power 22 percent of U.S. households each year” and “that could generate the same emissions as driving over 300 billion miles.”
While these findings are enlightening, they may not tell the whole story. O'Donnell and Crownhart acknowledge that “the common understanding of AI’s energy consumption is full of holes.” Are queries being made by individuals, or are they buried inside applications? How can energy sources be tracked more reliably? In addition, technology companies, data centers, utility companies, and others are not providing enough information for more accurate measurements. But one thing seems sure: the energy use associated with AI promises to be significant, and its impact on our future sustainability perhaps even more so.
Discussion Questions
- How often do you use AI every day?
- Why does it take less energy to generate an image than text?
- How important is it to measure carbon intensity when determining environmental impact?