Anna Pillar

Data Scientist Textmetrics
June 13, 2024


AI and Sustainability

If we were to list the topics of this century that spark heated debates and fill headlines, AI and Sustainability would be high up on the agenda. However, how often do we really talk about their relationship to each other? In my opinion, not enough.

In this blog post, I want to shed some light on how these two topics are linked and how their relationship has a dark and a light side, like almost everything in life. Unsurprisingly, AI, like most technical advancements, has a massive impact on our climate. But AI also offers solutions that help us to minimize our impact on our planet.

 

AI is hungry

AI models have grown massively in the last few years, and with them, their appetite for resources. To understand the environmental footprint of an AI model, we have to look at two main factors: how a model is built and how it is used.

When looking at how AI models are created, we look at their training. This is the phase in which the model is given a plethora of data to learn from. For Large Language Models (LLMs), the type of models used for generative AI applications such as ChatGPT, that means terabytes of data. Unsurprisingly, processing that amount of data costs a lot of energy, and with it, a lot of money.

Let's take a look at a couple of numbers on energy consumption. Hugging Face, a platform for open-source AI models, trained its own LLM, called BLOOM, and published a detailed report about the energy used during training. BLOOM is one of the larger LLMs, even though it is significantly smaller than giant models such as Google's Gemini and OpenAI's GPT-4.

The training of BLOOM required 433,196 kWh. To put this undoubtedly big number into perspective, it is equivalent to the yearly energy consumption of 70 people in the Netherlands*. But it doesn't stop there. When developing an LLM, the training phase is only one part of a whole process. Since the training is so expensive, it is usually preceded by a series of smaller-scale experiments and partial training sessions, as well as a lot of initial research. This means that researchers train a series of intermediate, smaller models before they start training the full-scale model. In the case of BLOOM, all the steps taken before the actual training required roughly 1.7 times as much energy as the final training run. This brings the total energy consumption of the model development to 1,163,088 kWh (roughly the annual energy usage of 190 Dutch people). Even though the final training by itself is the most energy-intensive single step, it is responsible for only about 37% of the total energy consumption during model development.
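To make these comparisons concrete, here is a minimal Python sketch that reproduces the arithmetic above. The kWh figures are the ones quoted from the BLOOM report, and the Dutch per-capita figure comes from the footnote below; everything else is simple division.

# Back-of-the-envelope check of the BLOOM energy figures quoted above.
BLOOM_FINAL_TRAINING_KWH = 433_196       # final training run
BLOOM_TOTAL_DEVELOPMENT_KWH = 1_163_088  # including experiments and intermediate models
DUTCH_PER_CAPITA_KWH_PER_YEAR = 6_129    # average Dutch person in 2022 (see footnote *)

print(BLOOM_FINAL_TRAINING_KWH / DUTCH_PER_CAPITA_KWH_PER_YEAR)     # ~70 person-years of energy
print(BLOOM_TOTAL_DEVELOPMENT_KWH / DUTCH_PER_CAPITA_KWH_PER_YEAR)  # ~190 person-years of energy
print(BLOOM_FINAL_TRAINING_KWH / BLOOM_TOTAL_DEVELOPMENT_KWH)       # ~0.37, i.e. about 37% of the total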

What many people underestimated for quite some time, however, is the amount of energy it costs to run these models for inference. Inference, in basic terms, is the usage of the model in an application. In other words, when you write a prompt to query an LLM, you are making an "inference call" to the model. Training is by far the more energy-intensive phase per run, loosely speaking because the model still has to "learn", which requires enormous amounts of computation. Once a model is fully trained, generating an answer to a single prompt costs only a small fraction of those resources. The catch, as we will see, is the sheer number of prompts.

AI models have become more and more integrated into our lives, and for many of us, not a day goes by without using an AI tool in some form or another. They are on our computers, phones, and tablets; they control our homes, our music selection, the videos we watch, and how we get from A to B. But with the increased usage comes increased energy consumption.

Again using LLMs as an example: ChatGPT is one of the fastest-growing services ever. Thus, it comes as no surprise that the energy required for running the models quickly exceeds the energy that was used in training. The research firm SemiAnalysis estimates the energy usage of ChatGPT at around 564,000 kWh per day**. In other words, ChatGPT uses as much energy per day as 92 Dutch people use in a year.
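To get a feeling for how quickly day-to-day inference overtakes the one-off cost of building a model, here is another small sketch based on the numbers quoted above. Comparing ChatGPT's inference energy with BLOOM's development energy is only meant to illustrate the order of magnitude, since the models behind ChatGPT are far larger than BLOOM.

# How fast does serving a popular model overtake the energy spent developing one?
CHATGPT_INFERENCE_KWH_PER_DAY = 564_000   # SemiAnalysis estimate quoted above
BLOOM_TOTAL_DEVELOPMENT_KWH = 1_163_088   # from the training section above
DUTCH_PER_CAPITA_KWH_PER_YEAR = 6_129

days_to_match = BLOOM_TOTAL_DEVELOPMENT_KWH / CHATGPT_INFERENCE_KWH_PER_DAY
print(f"{days_to_match:.1f} days of inference ~ one full model development")  # ~2.1 days

print(CHATGPT_INFERENCE_KWH_PER_DAY / DUTCH_PER_CAPITA_KWH_PER_YEAR)  # ~92 Dutch person-years of energy per day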

This all paints a very dark picture of AI and what AI does to our environment. But there is also a light side. A side where AI is actively used to make our world greener.

* Per capita energy consumption for the average Dutch person in 2022 was 6,129 kWh. Source: https://www.enerdata.net/estore/energy-market/netherlands/

** Source: https://www.semianalysis.com/p/the-inference-cost-of-search-disruption

The Green side of AI

Even though AI is currently mostly associated with chatbots that generate text or images, AI can do so much more and can be applied to nearly every area, including sustainability. AI's biggest strength is that it can make sense of data that would take humans an eternity to process, and it can perform cognitive tasks at a far lower cost than any human workforce could. What that means for our job market is another discussion. But if applied correctly, AI can be used to solve tasks for which there would normally be no resources, something that is especially helpful in areas like social good and sustainability.

For example, computer vision can be used to identify different materials in our trash during the recycling process, which allows us to recycle much more efficiently.
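As an illustration of what the classification step in such a system might look like, here is a minimal sketch. It assumes a ResNet-18 that has been fine-tuned on a labelled waste-image dataset; the weights file, image path, and class list are all hypothetical placeholders.

# Minimal sketch: classifying a photo of a waste item into material categories.
# Assumes a ResNet-18 fine-tuned on a labelled waste dataset (hypothetical
# weights file "waste_resnet18.pt"); the class list below is illustrative.
import torch
from torchvision import models, transforms
from PIL import Image

CLASSES = ["cardboard", "glass", "metal", "paper", "plastic", "organic"]

model = models.resnet18()
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.load_state_dict(torch.load("waste_resnet18.pt", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = Image.open("conveyor_item.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # shape: (1, 3, 224, 224)
with torch.no_grad():
    probabilities = torch.softmax(model(batch), dim=1)[0]
print(CLASSES[int(probabilities.argmax())])  # predicted material of the item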

Optimization techniques can help determine the most emission-friendly routes for delivery services and other suppliers; whole supply chains can be optimized with AI in this way, resulting in lower costs for the companies and lower emissions for our environment.
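As a toy illustration of the routing idea, the sketch below picks delivery stops greedily by nearest neighbour and scores the route by estimated CO2. All of the values (coordinates, emission factor) are invented, and real route optimization relies on far more sophisticated vehicle-routing solvers.

# Minimal sketch: greedy nearest-neighbour routing, scored by estimated CO2 per leg.
import math

EMISSION_KG_CO2_PER_KM = 0.25  # illustrative value for a delivery van

def distance_km(a, b):
    # crude planar distance between (x, y) points given in km
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_route(depot, stops):
    """Visit the nearest remaining stop next, summing estimated emissions along the way."""
    route, current, remaining = [depot], depot, list(stops)
    total_kg_co2 = 0.0
    while remaining:
        nxt = min(remaining, key=lambda s: distance_km(current, s))
        total_kg_co2 += distance_km(current, nxt) * EMISSION_KG_CO2_PER_KM
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route, total_kg_co2

depot = (0.0, 0.0)
stops = [(2.0, 3.0), (5.0, 1.0), (1.0, 7.0), (6.0, 6.0)]
route, kg_co2 = greedy_route(depot, stops)
print(route, round(kg_co2, 2))  # visiting order and estimated emissions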

Almost ironically, considering how much energy AI systems themselves use, one big area in which AI can have a positive impact on our environment is electric grid management. AI can forecast energy demand, which helps reduce wasted energy and therefore makes the grid more sustainable.
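To give a flavour of what demand forecasting involves, here is a deliberately simple baseline, a seasonal-naive forecast: demand at a given hour tomorrow is predicted to equal demand at the same hour today. The demand figures are invented, and real grid operators layer weather data, calendars, and machine-learning models on top of baselines like this.

# Minimal sketch: a seasonal-naive forecast of hourly electricity demand.
hourly_demand_mwh = [
    310, 295, 290, 300, 340, 410, 480, 520, 530, 510, 500, 495,
    490, 485, 495, 520, 560, 590, 570, 520, 470, 420, 370, 330,
]  # invented demand for the last 24 hours, in MWh

def seasonal_naive_forecast(history, horizon=24, season=24):
    """Predict each future hour as the demand observed one season (one day) earlier."""
    return [history[-season + (h % season)] for h in range(horizon)]

forecast = seasonal_naive_forecast(hourly_demand_mwh)
print(forecast[:6])  # expected demand for the next six hours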

Talking about prediction: not only can AI fight climate change by offering sustainable solutions, it can also help us live with the consequences of climate change we are already facing. Big topics in this area are forecasting long-term climate trends, predicting climate and oceanic phenomena, and helping us better understand and predict the weather. With AI, we can simulate many scenarios and test a vast number of counteractions to see how we can respond to the disastrous natural catastrophes that have already become more and more frequent.

 

But is it now good or bad?

There are many examples and amazing projects that showcase how AI can be used in a responsible, ethical way to help us live with the damage we have already done to our planet and hopefully reduce it further. But this alone cannot justify the massive burden that large AI systems place on our climate. A lot can and needs to be done to reduce humanity's footprint on the planet. AI will surely have a big impact on this, but whether it will ultimately be for better or worse remains to be seen. All we can do is try to shift the balance slightly towards the light, green side.

General Sources and further reading:

de Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191-2194.
Cowls, J., Tsamados, A., Taddeo, M., & Floridi, L. (2023). The AI gambit: leveraging artificial intelligence to combat climate change—opportunities, challenges, and recommendations. AI & Society, 1-25.
