
AI projected to consume 3.5% of global electricity by 2030

Artificial intelligence has become a staple of our everyday lives. With the introduction of ChatGPT, the wonders of AI reached the masses. Whether as a trusted partner at work that boosts your efficiency or as a personal chef that tells you what to cook with the leftover ingredients in your fridge, AI is there to solve the problem. The societal impact of this new technology is tremendous, but unfortunately, the same goes for its climate consequences.


Energy and resource consumption in the age of language models

Artificial intelligence was not invented on November 30th, 2022, when OpenAI launched ChatGPT. The theory and techniques of AI were established and used in scientific research long before AI became a buzzword. ChatGPT was, however, a turning point in showing what AI, and more specifically language models, are capable of. For most people, AI went from a concept seen in science fiction movies to something palpable with a direct impact on everyday life. As often happens with such technologies, once the word is out, development accelerates. Almost every new product launched nowadays needs to have something to do with AI, or else, did you really try? This rush for AI applications comes at a cost. Training a language model like the one behind ChatGPT requires copious amounts of energy: training GPT-3, the predecessor of the model that powered ChatGPT at launch, consumed around 1,300 MWh, according to Stanford University. A ChatGPT query, in turn, requires roughly 25 times more energy than a simple Google search. The American consulting firm Gartner projects that 3.5% of worldwide electricity demand will go to AI by 2030, twice the power demand of France. AI also has an impact on water consumption: a casual conversation with the chatbot quickly evaporates half a liter of water, which raises the question of whether its use is always justified. To answer that question, we need to know how these language models work.
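To put these projections in perspective, here is a rough back-of-envelope check. It uses the 3.5% share and the 25x-per-query figure from above, combined with ballpark assumptions of my own (roughly 27,000 TWh of annual global electricity demand, roughly 470 TWh for France, and ~0.3 Wh per Google search); these assumed figures are illustrative, not measurements.

```python
# Back-of-envelope check of the scale claims above.
# The numeric constants below are rough, illustrative assumptions.

GLOBAL_ELECTRICITY_TWH = 27_000   # assumed annual global electricity demand (TWh)
FRANCE_ELECTRICITY_TWH = 470      # assumed annual electricity demand of France (TWh)
AI_SHARE_2030 = 0.035             # Gartner's projected AI share of global demand by 2030

ai_demand_twh = AI_SHARE_2030 * GLOBAL_ELECTRICITY_TWH
print(f"Projected AI electricity demand: {ai_demand_twh:.0f} TWh/year")
print(f"Relative to France: {ai_demand_twh / FRANCE_ELECTRICITY_TWH:.1f}x")

# Per-query comparison using the article's 25x figure and an assumed
# ~0.3 Wh per Google search (a commonly cited ballpark).
GOOGLE_SEARCH_WH = 0.3
chatgpt_query_wh = 25 * GOOGLE_SEARCH_WH
print(f"Energy per ChatGPT query: ~{chatgpt_query_wh:.1f} Wh")
```

With these assumptions, 3.5% of global demand works out to roughly 950 TWh per year, about twice France's consumption, which is consistent with the comparison above.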


Balancing efficiency and sustainability

Imagine a library containing every book ever written by mankind. Before any notion of artificial intelligence, you would find information by going into the library, searching for the book you want, and reading exactly what the author wrote down. The library then decides to hire a person, call him Al, who has to memorize all the books. You now have the option of getting your information either by searching the library or by asking Al directly. Asking Al has two implications: you are not only forcing him to recall your specific topic, he is also formulating sentences from scratch, just like anyone else telling a story about a topic they have learned. Obviously, it takes less brain power to look up a copy of Einstein's autobiography in the library than to ask Al to come up with a story, not to mention all the energy drinks he went through to memorize the books in the first place. Al becomes highly useful, however, when we need to combine books from the library. Rather than working through 20 books on your own and trying to connect their topics, simply asking Al is far more efficient. Thus, the use of ChatGPT is justified when we can leverage its full functionality rather than a small subset of its services. For a quick biography, a Wikipedia search will be far more efficient. The same deliberation between efficiency and ease of use needs to be made for every other form of AI as well to keep it sustainable.


Proactive policies for sustainable technological growth

The surge of artificial intelligence has exposed another weak spot in technological development and climate policy: act first, ask climate questions later. First, the climate impact of new technologies should be assessed in advance and integrated into the design process, and companies should have incentives to do so. Communication around this also matters, so that users have guidelines on how to use new technologies sustainably. The main takeaway, however, is that energy policies and forecasts should not simply extrapolate the current situation to 2050. One must account for so-called “black swans”: unexpected, disruptive events that can have a massive impact on our society. Consider the disruptive effect of ChatGPT on education, copywriting, administration, and many other fields, or what the impact might be if self-driving cars fully take off.


AI for a greener future

This is not a plea to stop using AI; on the contrary. Artificial intelligence can also work in favor of our climate. Optimizing energy use, transmission, and generation, for example, is a task perfectly suited to these techniques. The vast number of data points and variables needed to understand the mechanisms of climate change itself is something every theoretical model struggles with without the help of AI. Future improvements in the energy efficiency of training and using AI models could lower its climate footprint as well. When used correctly, the benefits of AI consequently far outweigh the drawbacks. Official guidelines, or even terms of use, could help people become more aware of the climate impact of AI. You don’t always need a calculator to add one and one together.

About the author

Ruben Vandewouer
