Generative AI’s Energy Problem Today Is Foundational

Before AI can take over, it will need to find a new approach to energy

[Illustration: An animated gif of a frog on an endless loop sticking out its tongue and sucking in 0s and 1s.]
Nicholas Little

While the origins of artificial intelligence can be traced back more than 60 years to the mid-20th century, the explosion of generative AI products such as ChatGPT and Midjourney over the past two years has brought the technology to a new level of popularity. But that popularity comes at a steep energy cost, an operational reality that is often pushed to the margins and left unsaid.

Alex de Vries is a Ph.D. candidate at VU Amsterdam and founder of the digital-sustainability blog Digiconomist. In a report published earlier this month in Joule, de Vries analyzed trends in AI energy use and predicted that current AI technology could be on track to consume as much electricity annually as the entire country of Ireland (29.3 terawatt-hours per year).

“A single LLM interaction may consume as much power as leaving a low-brightness LED lightbulb on for one hour.”
—Alex de Vries, VU Amsterdam

Many generative AI tools rely on a type of natural-language processing called large language models (LLMs) to first learn and then make inferences about languages and linguistic structures (such as code or legal-case prediction) used throughout the world. While the training process of these LLMs typically receives the brunt of environmental concern (models can ingest many terabytes of data and use over 1,000 megawatt-hours of electricity), de Vries's report highlights that the electricity consumed while making inferences may, in some cases, be even higher than that used for training.

“You could say that a single LLM interaction may consume as much power as leaving a low-brightness LED lightbulb on for one hour,” de Vries says.
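
As a rough back-of-the-envelope check of that comparison (the per-query energy below is an illustrative assumption chosen for the arithmetic, not a measured figure from de Vries's report), the numbers work out like this:

```python
# Illustrative back-of-envelope only; the per-query figure is an assumption,
# not a measured value from de Vries's report.
energy_per_query_wh = 3.0   # assume an LLM query costs on the order of 3 watt-hours
led_power_w = 3.0           # a low-brightness LED bulb draws roughly 3 watts
hours_of_led_light = energy_per_query_wh / led_power_w
print(f"One query ~= running the LED for about {hours_of_led_light:.1f} hour(s)")
```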

Roberto Verdecchia is an assistant professor at the University of Florence and the first author of a paper published earlier this year on developing green AI solutions. He says that de Vries's predictions may even be conservative when it comes to the true cost of AI, especially when considering the nonstandardized regulation surrounding this technology.

“I would not be surprised if also these predictions will prove to be correct, potentially even sooner than expected,” he says. “Considering the general IT environmental sustainability trends throughout the years, and the recent popularization of LLMs, the predictions might be even deemed as conservative.”

AI’s energy problem has historically been approached through optimizing hardware, says Verdecchia. However, continuing to make microelectronics smaller and more efficient is becoming “physically impossible,” he says.

In his paper, published in the journal WIREs Data Mining and Knowledge Discovery, Verdecchia and colleagues highlight several algorithmic approaches that experts are taking instead. These include improving data-collection and processing techniques, choosing more-efficient libraries, and improving the efficiency of training algorithms.
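
One concrete example of such an algorithmic lever (a minimal sketch of my own, not a technique singled out in Verdecchia's paper) is mixed-precision training, which lets a framework such as PyTorch do much of its arithmetic in 16-bit floating point and thereby cut the compute and memory used per training step:

```python
# Minimal sketch of mixed-precision training in PyTorch (requires a CUDA GPU);
# one example of an algorithmic efficiency lever, not the paper's specific method.
import torch
from torch import nn

model = nn.Linear(512, 10).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scaler = torch.cuda.amp.GradScaler()      # keeps small fp16 gradients from vanishing

for step in range(100):
    x = torch.randn(64, 512, device="cuda")
    y = torch.randint(0, 10, (64,), device="cuda")
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():       # run the forward pass in float16 where safe
        loss = nn.functional.cross_entropy(model(x), y)
    scaler.scale(loss).backward()         # backpropagate on the scaled loss
    scaler.step(optimizer)
    scaler.update()
```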

“The solutions report impressive energy savings, often at a negligible or even null deterioration of the AI algorithms’ precision,” Verdecchia says.

Yet, even with work underway to improve the sustainability of AI products, de Vries says, these solutions may ultimately just help these AI tools extend their reach even further.

“In the race to produce faster and more-accurate AI models, environmental sustainability is often regarded as a second-class citizen.”
—Roberto Verdecchia, University of Florence

We need to consider rebound effects, de Vries says, such as “increasing efficiency leading to more consumer demand, leading to an increase in total resource usage and the fact that AI efficiency gains may also lead to even bigger models requiring more computational power.”

Ultimately, de Vries and Verdecchia concur, human self-regulation could play an equally important role in curbing the growth of AI's energy consumption. For example, developers will need to decide whether eking out another precision point from their model is worth the jump in that model's environmental impact, Verdecchia says.

Unfortunately, this kind of self-restraint may be easier said than done, particularly when the market demands newer and better products.

“In the race to produce faster and more-accurate AI models, environmental sustainability is often regarded as a second-class citizen,” Verdecchia says.

De Vries argues that developers should also think critically about which products really need AI integration. For example, de Vries's paper estimates that Google would need to spend roughly US $100 billion on servers alone if the search engine were to incorporate AI inference into every single one of its Web searches.
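
For a sense of the electricity scale such a move could imply (both inputs below are assumptions chosen purely for illustration, not figures quoted from the paper):

```python
# Order-of-magnitude sketch only; both inputs are illustrative assumptions.
searches_per_day = 9e9        # assume roughly 9 billion Google searches a day
energy_per_query_wh = 3.0     # assume roughly 3 watt-hours per AI-assisted search
annual_twh = searches_per_day * energy_per_query_wh * 365 / 1e12
print(f"~{annual_twh:.0f} TWh per year")   # about 10 TWh/year under these assumptions
```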

“I think the biggest responsibility is with institutions that are currently forcing AI on all kinds of solutions, regardless of whether it is the best fit, [because they’re] influenced by hype and fear of missing out,” he says. “It will be crucial to realize that AI is not a miracle cure and has its own limitations.”

As for users asking ChatGPT to write silly stories or generate fantastical photos, Verdecchia says these individual habits are not going to make or break the environmental impact of these products. That said, thinking and speaking critically about their impact could help move the needle in the right direction as developers work behind the scenes.

“Pushing for a clear, transparent, and comparable monitoring and reporting of AI sustainability is the first step required to make AI more environmentally sustainable,” Verdecchia says.
