The Unseen Environmental Impact Of AI

Sitting at your computer and entering a prompt into ChatGPT feels far removed from the environmental troubles plaguing the Earth. The truth is that AI, and especially the data centres used to train and run AI models, has a massive environmental impact.

AI models like ChatGPT depend on data centres: facilities that house the data storage and computing hardware needed to run the enormous number of calculations involved in ‘training’ a model. Doing so requires a staggering amount of electricity and water. While data centres have been around since the 1940s, their growing scale, demand, and number have raised concerns among climate change academics and activists alike. Amazon alone has around 100 data centres, Google has more than 35, and Microsoft has more than 100.

Elsa Olivetti, a professor in the Department of Materials Science and Engineering and lead of the Decarbonization Mission of the Massachusetts Institute of Technology’s (MIT) new Climate Project, says that the daily use of AI is having a far bigger impact than people might think.

The numbers

AI is already firmly embedded in the economy, meaning that the fourth industrial revolution might need an energy revolution to go with it

Training an AI model, the process of setting parameters and running billions of inputs through the system to fine-tune it for its tasks, is an energy-intensive undertaking. Training GPT-4 (the latest model behind ChatGPT) required 62.3188 gigawatt hours (GWh) of electricity, 42 times more than the previous model, and as these technologies become more sophisticated their energy use keeps rising.
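As a rough, back-of-the-envelope illustration of what a figure like 62.3188 GWh means, the sketch below converts it into household-years. The assumed household consumption of roughly 10,500 kWh per year is a commonly cited US average, not a figure from this article.

```python
# Back-of-the-envelope comparison for the GPT-4 training-energy figure cited above.
# The household consumption value is an assumed, commonly cited US average,
# not a number taken from this article.

gpt4_training_gwh = 62.3188        # GWh, as cited above
ratio_vs_previous = 42             # GPT-4 vs the previous model, as cited above
household_kwh_per_year = 10_500    # assumed average annual household use (kWh)

previous_model_gwh = gpt4_training_gwh / ratio_vs_previous
gpt4_training_kwh = gpt4_training_gwh * 1_000_000   # 1 GWh = 1,000,000 kWh
households_for_a_year = gpt4_training_kwh / household_kwh_per_year

print(f"Implied energy to train the previous model: ~{previous_model_gwh:.2f} GWh")
print(f"GPT-4 training energy ≈ annual use of ~{households_for_a_year:,.0f} households")
```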

According to MIT, the power requirements of data centres in North America grew “from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023”, driven in part by the demands of generative AI.

In 2022, data centres globally used 460 terawatt-hours (TWh) of electricity, which would make them the 11th largest electricity consumer in the world if they were a country. For context, Saudi Arabia used 373 TWh and France used 463 TWh in the same year. By 2026, data centres are projected to use 1,050 TWh, which would make them the fifth largest consumer, more than Russia.
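The comparison above can be laid out with a minimal sketch that uses only the figures quoted in this article (all values in terawatt-hours per year):

```python
# Compare global data-centre electricity use against the country figures cited above.
# All values are TWh per year, taken directly from the article.

consumption_twh = {
    "Data centres (2022)": 460,
    "Data centres (2026, projected)": 1050,
    "Saudi Arabia (2022)": 373,
    "France (2022)": 463,
}

# Print from largest to smallest consumer.
for name, twh in sorted(consumption_twh.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:32s} {twh:5d} TWh")
```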

Even after these models are trained, running them continues to use more energy than traditional computing: a ChatGPT query uses around 25 times more energy than a Google search.
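To illustrate what a per-search ratio like that implies at scale, the sketch below assumes roughly 0.3 watt-hours per conventional Google search, a figure widely cited elsewhere but not given in this article, and a purely hypothetical daily query volume:

```python
# Illustrative per-query energy comparison.
# The 0.3 Wh per Google search figure is an assumption (widely cited elsewhere,
# not from this article); the 25x ratio is the one quoted above; the daily
# query volume is purely hypothetical.

google_search_wh = 0.3          # assumed energy per conventional search (Wh)
chatgpt_ratio = 25              # per-search energy ratio cited above
queries_per_day = 10_000_000    # hypothetical daily query volume

chatgpt_query_wh = google_search_wh * chatgpt_ratio
extra_kwh_per_day = (chatgpt_query_wh - google_search_wh) * queries_per_day / 1000

print(f"Implied energy per ChatGPT query: ~{chatgpt_query_wh:.1f} Wh")
print(f"Extra energy for {queries_per_day:,} queries/day: ~{extra_kwh_per_day:,.0f} kWh/day")
```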

What next

AI’s daily use has exploded worldwide, and it is already embedded in the economy and society. Going forward, the big answers will come from moving data centres to cleaner energy sources and finding creative ways to offset AI’s impact.
