Sitting at your computer and entering a prompt into ChatGPT feels far removed from the environmental troubles plaguing the Earth. The truth is that AI – especially the data centres used to train and run models like ChatGPT – has a massive environmental impact.
AI models like ChatGPT depend on data centres: facilities that house the data storage and computing hardware needed to run the intensive computational workloads that ‘train’ an AI model. To do this, these data centres require a staggering amount of electricity and water. While data centres have been around since the 1940s, the increase in their scale, demand, and number has raised concerns among climate change academics and activists alike. Amazon alone has 100 data centres, Google has more than 35, and Microsoft has more than 100. Elsa Olivetti, professor in the Department of Materials Science and Engineering and lead of the Decarbonization Mission of the Massachusetts Institute of Technology’s (MIT) new Climate Project, says that the daily use of AI is having a far bigger impact than people might think.

The numbers

AI is already firmly embedded in the economy, meaning that the fourth industrial revolution might need an energy revolution to go with it
According to MIT, the power demand of data centres in North America grew from “2688 megawatts at the end of 2022 to 5341 megawatts at the end of 2023, (partly) driven by the demands of generative AI.”
In 2022, data centres worldwide used 460 terawatt-hours of electricity – enough to rank as the 11th-largest electricity consumer in the world if they were a country. For context, Saudi Arabia used 373 terawatt-hours and France used 463 in the same year. By 2026, data centres are projected to use 1050 terawatt-hours, which would make them the fifth-largest electricity consumer, ahead of Russia. Even after these models are trained, they continue to use more energy than traditional computing: a ChatGPT query uses 25 times more energy than a Google search.