How much electricity does artificial intelligence really consume? The IEA and MIT estimates

Chatbots and virtual assistants, complex calculations solved in moments, images generated within seconds: artificial intelligence (AI) has now become part of our daily lives. Precisely for this reason, a question arises naturally: how much energy does AI really consume?

Knowing the real consumption of artificial intelligence with certainty is quite difficult: the tech giants are not very transparent about it, and the estimates published in recent months have never been confirmed. Easier to calculate, however, is the consumption of data centers, the digital infrastructures underlying everything, essential for the correct functioning of AI. According to a study published in the MIT Technology Review, by 2028 more than half of the electricity destined for data centers will be used for artificial intelligence: this means that AI alone could consume annually an amount of electricity equal to 22% of the consumption of US households.

So let’s look at the official estimates from the International Energy Agency (IEA) and the growth forecasts for artificial intelligence through 2030.

Estimates of the energy consumed by artificial intelligence

As mentioned, there are no official figures on artificial intelligence's energy consumption: consequently, all the information we have on its energy impact comes from estimates.

Last February, for example, the AI research institute Epoch AI published data (never confirmed by OpenAI) on the amount of energy used by a single ChatGPT query: according to its calculations, each message consumes about 0.3 watt-hours (Wh), the equivalent of keeping a 10 W LED bulb on for about 2 minutes. Generating a billion messages per day for a year would therefore mean consuming over 109 gigawatt-hours (GWh) of electricity, enough to power 10,400 US homes for a year.
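These figures can be checked with simple arithmetic. The 0.3 Wh per message and the 10,400-home equivalence come from the estimate above; the per-home consumption derived below is merely the implied value, not an official statistic:

```python
# Sanity check of the Epoch AI estimate (0.3 Wh/message is the article's figure;
# everything else is arithmetic).
WH_PER_MESSAGE = 0.3      # Wh per ChatGPT query (Epoch AI estimate)
MESSAGES_PER_DAY = 1e9    # one billion messages per day

annual_wh = WH_PER_MESSAGE * MESSAGES_PER_DAY * 365
annual_gwh = annual_wh / 1e9          # 1 GWh = 1e9 Wh
print(f"Annual consumption: {annual_gwh:.1f} GWh")   # ≈ 109.5 GWh

# Implied annual consumption per US home, given the article's 10,400 homes
per_home_kwh = annual_wh / 1e3 / 10_400
print(f"Implied per-home use: {per_home_kwh:,.0f} kWh/year")  # ≈ 10,529 kWh
```

The implied figure of roughly 10,500 kWh per home per year is consistent with typical US household electricity use, which suggests the two numbers in the estimate were derived from each other.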

The consumption of data centers is easier to measure: according to data compiled by the International Energy Agency (IEA), in 2024 data centers consumed 1.5% of the world's electricity. These are the digital infrastructures behind every online purchase, every article published on the Internet and every lesson followed remotely. While between 2005 and 2017 the electricity consumed by data centers remained stable (despite greater technological development), over the last 5 years data-center consumption has grown by 12% per year, as these facilities have become essential infrastructure for artificial intelligence models as well.

In general, while a conventional data center consumes between 10 and 25 megawatts (MW), a hyperscale data center (a large facility built to support hundreds of thousands of interconnected servers) consumes about 100 MW.

The problem is that these infrastructures are often grouped into large clusters, straining local electricity production and unbalancing grid management. In Ireland, for example, data centers consume about 20% of national electricity demand, while in 6 US states these facilities account for 10% of total electricity consumption, with the state of Virginia in the lead at 25%.

Data center distribution around the world. Credit: IEA

What AI's consumption depends on

In general, to better understand estimates of AI consumption, it is useful to distinguish between two different phases:

Consumption for training AI models

In recent years, the amount of data and computation needed to train cutting-edge artificial intelligence models has grown exponentially: according to estimates, the training data for the GPT-4 model amount to about 4.9 trillion units of raw data.

These training calculations are performed on specialized computer chips such as GPUs (Graphics Processing Units, fundamental for AI development): a single GPU can have a maximum nominal power draw of 1,000 watts (in the case of the most recent and most powerful chips), which corresponds to the consumption of a toaster.

Large models, however, are trained using many GPUs: staying with GPT-4, this model was trained on 25,000 GPUs with a combined nominal power of about 10 MW. Adding the energy required by the cooling equipment, the total nominal power of the hardware used to train GPT-4 comes to about 22 MW, which corresponds to the power draw of about 150 high-power charging stations for electric vehicles.

In short, training the largest artificial intelligence models requires a power draw of about 154 MW, and the cumulative energy consumed to train large AI models is estimated at around 1,700 GWh. For comparison, a typical Italian family consumes on average 2,700 kWh per year (ARERA data for a household of 4 living in a home in climate zone E): this means that the cumulative consumption of large AI models equals the annual consumption of about 630,000 Italian families.
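The family equivalence follows directly from the two figures above:

```python
# Back-of-envelope check: cumulative AI training energy vs Italian households.
# Both input figures (1,700 GWh cumulative; 2,700 kWh per family per year, ARERA)
# are the article's.
TRAINING_GWH = 1_700
FAMILY_KWH_PER_YEAR = 2_700

families = TRAINING_GWH * 1e6 / FAMILY_KWH_PER_YEAR   # 1 GWh = 1e6 kWh
print(f"{families:,.0f} families")                    # ≈ 629,630, i.e. about 630,000
```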

Consumption from using AI models

The amount of energy used by AI models for each interaction depends on several factors, including:

  • The length of the user's request (input) and of the model's response (output): naturally, the longer the answers and the more computation they require, the higher the energy consumption.
  • The size of the model: larger models require more computation to process input and output and, consequently, consume more electricity.
  • The type of input and output: generating videos and images requires more computation than generating text, and therefore consumes more.

The differences in consumption when producing text, images and videos

In general, therefore, the consumption of artificial intelligence models depends on what users request: according to IEA data, generating a video of just a few seconds can require the same amount of energy needed to charge a laptop twice.

According to estimates, a small language model consumes approximately 0.3 Wh to generate a text, compared with 5 Wh for a medium-sized language model. Generating an image consumes approximately 1.7 Wh, while generating a short (6-second), low-quality video consumes at least 115 Wh.

To get an idea, charging a smartphone requires about 15 Wh, rising to 60 Wh if the device being charged is a laptop.
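Putting the per-output estimates and the device-charging figures side by side makes the comparison concrete (all numbers are the article's estimates):

```python
# Energy per generated output (estimated Wh) vs device charges.
outputs = {
    "text (small model)":    0.3,
    "text (medium model)":   5.0,
    "image":                 1.7,
    "6 s low-quality video": 115.0,
}
PHONE_WH, LAPTOP_WH = 15, 60   # one full smartphone / laptop charge

for name, wh in outputs.items():
    print(f"{name}: {wh} Wh -> {PHONE_WH / wh:.0f} generations per phone charge")
```

For instance, one phone charge (15 Wh) equals about 50 small-model text generations, while a single short video (115 Wh) costs nearly two laptop charges (115/60 ≈ 1.9), matching the "charge a laptop twice" comparison above.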

Estimated AI consumption by type of generated output (text or image), compared with the consumption needed to recharge a phone and a computer. Credit: IEA

Large language models (ChatGPT, Gemini, DeepSeek, etc.) tend to offer better performance in terms of precision and quality, but at the same time they consume much more. According to tests conducted by the IEA, an image-generation model could produce about 55 trillion images with about 100 TWh of input electricity, roughly half the annual energy needs of a country such as Iceland.

Last March, when ChatGPT began generating images inspired by the style of Studio Ghibli, the model produced 78 million images in a single day. Considering that in 2025 ChatGPT was confirmed as the fifth most visited site in the world (even overtaking WhatsApp), it is clear that the threshold of 55 trillion generated images is not so difficult to reach.

The growth of AI consumption through 2030

As mentioned, in 2024 data centers consumed about 1.5% of global electricity: according to IEA projections, by 2030 the electricity consumed by data centers will exceed 945 TWh, more than three times the annual energy needs of Italy (which stand at around 300 TWh).
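As a rough consistency check, the 2030 projection can be reconstructed from the two trends cited earlier: the 1.5% share of global electricity in 2024 and the ~12% annual growth rate of data-center consumption. The global electricity figure of roughly 30,000 TWh/year is an assumption of this sketch, not a number from the article:

```python
# Rough consistency check of the IEA projection.
# ASSUMPTION (not from the article): global electricity use ≈ 30,000 TWh/year.
GLOBAL_TWH = 30_000
share_2024 = 0.015       # data centers: 1.5% of global electricity in 2024
growth = 0.12            # ~12% annual growth observed over the last 5 years

dc_2024 = GLOBAL_TWH * share_2024       # ≈ 450 TWh
dc_2030 = dc_2024 * (1 + growth) ** 6   # compound growth, 2024 -> 2030
print(f"2024: {dc_2024:.0f} TWh; 2030 at 12%/yr: {dc_2030:.0f} TWh")
```

Extrapolating the historical 12% growth gives a 2030 figure in the high 800s of TWh, the same order of magnitude as the IEA's 945 TWh projection, which assumes growth accelerates further as AI workloads expand.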

It must be stressed, however, that not all the electricity required to power data centers depends on AI; at the same time, identifying precisely how much electricity demand derives from AI is increasingly difficult.

According to a study conducted by the MIT Technology Review, by 2028 more than half of the electricity destined for data centers will be used for artificial intelligence: at that point, AI alone could consume annually an amount of electricity equal to 22% of the consumption of all US households.

The estimated impact of AI on data-center electricity consumption from 2020 to 2030. Credit: IEA

This surge will also depend on the fact that, in the future, we will not use artificial intelligence only to answer everyday questions or to generate photos: the models will become true personalized agents shaped by user preferences, capable of carrying out tasks without human supervision and of solving increasingly complex problems.

The relationship between energy and artificial intelligence

The solution, however, is certainly not to stop using AI, also because energy and artificial intelligence depend closely on each other: if it is true that AI cannot exist without energy (especially electricity), it is equally true that artificial intelligence has the potential to transform the future of the energy sector.

As the IEA also points out, artificial intelligence can in turn be used to improve energy efficiency and reduce costs. For example, thanks to AI, weather forecasts will become increasingly accurate, making it possible to predict the energy produced by wind or photovoltaic plants with greater precision, and to monitor and optimize transmission lines in real time.

The problem, therefore, is finding the right energy mix to power without interruption the data centers that host the infrastructure AI needs, facilities that must remain active 24 hours a day, 365 days a year. In other words, data centers cannot depend solely on intermittent energy sources, such as renewables, which are not able to produce energy at all times.

Tech giants such as Meta, Amazon and Google are trying to address the problem by investing in nuclear energy, focusing mainly on new modular technologies (the much-discussed Small Modular Reactors, SMRs), even though building this new infrastructure will take several years.