In the coming years, AI could consume as much energy as an entire nation

If 2023 was the year of artificial intelligence, 2024 is the year in which the environmental impact of AI can no longer be ignored. Already in January, during the annual meeting of the World Economic Forum in Switzerland, Sam Altman – CEO of OpenAI, the company behind ChatGPT – addressed the energy challenges afflicting the sector, stating that “a turning point is needed” to reduce the energy and drinking-water consumption linked to AI. Also at the WEF, Hitachi president Toshiaki Higashihara cited an estimate according to which data centers (computing facilities made up of thousands of servers) could consume 1,000 times more electricity in 2050 than they do today, due in part to the enormous energy hunger of AI. The issue has even reached the US House of Representatives, where a bill to address the heavy environmental cost of artificial intelligence was introduced last month.

How much energy does artificial intelligence consume?

Training AI models and answering the millions of queries sent every day to tools such as ChatGPT or Google Gemini requires enormous data centers whose energy consumption is anything but negligible, both to keep the servers running and to power their cooling systems.

For example, it is estimated that receiving a response from a generative artificial intelligence tool consumes 4-5 times more energy than a traditional web search. According to an estimate produced by a University of Washington research team led by Sajjad Moazeni, the most famous AI chatbot – ChatGPT – needs approximately 1 gigawatt-hour (GWh) per day, the equivalent of the consumption of 33,000 average US households. In this regard, Moazeni stated:

These numbers may seem acceptable for now, but this is only the beginning of the widespread development and adoption of these models. (…) Furthermore, as models become more sophisticated they grow larger and larger, meaning that the data center energy required to train and operate them could become unsustainable.
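A quick back-of-envelope check shows that these figures hang together: 1 GWh per day divided among 33,000 households works out to roughly 30 kWh per household per day, close to average US residential electricity use. A minimal sketch of the arithmetic (the inputs are the figures quoted above; the ~29 kWh/day US household average is an approximate public figure):

```python
# Back-of-envelope check: does ~1 GWh per day really correspond to
# 33,000 average US households? Inputs are the figures quoted above;
# the ~29 kWh/day household average is an approximate public figure.

CHATGPT_GWH_PER_DAY = 1.0   # estimated daily consumption of ChatGPT (GWh)
HOUSEHOLDS = 33_000         # households cited in the comparison

kwh_per_day = CHATGPT_GWH_PER_DAY * 1_000_000   # 1 GWh = 1,000,000 kWh
per_household = kwh_per_day / HOUSEHOLDS

print(f"Implied consumption per household: {per_household:.1f} kWh/day")
# -> about 30.3 kWh/day, close to the ~29 kWh/day US average,
#    so the two figures are consistent.
```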

Estimating how much energy the entire AI sector will consume in the coming years is a very complicated undertaking, but reasonable hypotheses can be made. A recent study, for example, estimates that by 2027 the AI sector could have an annual energy requirement of between 85 and 134 terawatt-hours (TWh), comparable to the electricity consumption of nations such as Ukraine (85 TWh per year), the Netherlands (108 TWh), Sweden (125 TWh) and Argentina (134 TWh). For comparison, Italy’s electricity consumption stands at around 300 TWh per year.
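To put that range in perspective, here is a short sketch computing where it falls relative to the national figures quoted above (all inputs are the numbers in the paragraph):

```python
# The projected 2027 AI energy requirement (85-134 TWh/year) expressed
# as a share of the national electricity figures quoted above.

AI_LOW_TWH, AI_HIGH_TWH = 85, 134

countries_twh = {
    "Ukraine": 85,
    "Netherlands": 108,
    "Sweden": 125,
    "Argentina": 134,
    "Italy": 300,
}

for country, twh in countries_twh.items():
    print(f"{country:12s} {twh:4d} TWh -> AI share: "
          f"{AI_LOW_TWH / twh:.0%} to {AI_HIGH_TWH / twh:.0%}")
# For Italy (~300 TWh), the AI sector would amount to roughly
# 28% to 45% of annual electricity consumption.
```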

How much water does artificial intelligence consume?

Data centers consume not only electricity but also water, which is mainly used in server cooling systems. For example, training Google Bard’s Large Language Model (LLM) resulted in a 20% increase in water consumption by the Mountain View giant. The situation is similar for Microsoft and Bing’s artificial intelligence, with a 34% increase. Training GPT-4 – the LLM behind ChatGPT, which recently passed the Turing Test – caused a 6% increase in water consumption in West Des Moines, Iowa, where the data center used by OpenAI resides. A recent study estimated that AI could consume between 4.2 and 6.6 billion cubic meters of water in 2027, equivalent to half of the UK’s annual consumption.
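The “half the UK” comparison can likewise be sanity-checked from the figures in the study: doubling 4.2-6.6 billion cubic meters implies a UK annual consumption of 8.4-13.2 billion cubic meters. A minimal sketch of that arithmetic (inputs taken from the estimate above):

```python
# Sanity check on the water comparison: if 4.2-6.6 billion cubic meters
# is half of the UK's annual consumption, what UK figure does that imply?
# Inputs are taken from the study cited above.

ai_water_low, ai_water_high = 4.2, 6.6   # billions of m^3 in 2027

implied_uk_low = ai_water_low * 2
implied_uk_high = ai_water_high * 2

print(f"Implied UK annual consumption: "
      f"{implied_uk_low:.1f}-{implied_uk_high:.1f} billion m^3")
# -> 8.4-13.2 billion m^3 per year
```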

How can we reduce the environmental impact of AI?

As one might imagine, there is no “magic pill” that can instantly solve the problem: the solution must necessarily come from multiple fronts. From a purely technical point of view, we will first need more efficient algorithms and lower-consumption hardware; the use of renewable energy will also be a priority. Then there is the legislative side, with limits and targets for electricity and water consumption, but also incentives for more efficient designs and the use of renewable sources.