The human brain is one of the most efficient information-processing systems ever observed: it consumes just 20 watts to power roughly 86 billion neurons while handling cognitive tasks of remarkable complexity.
With a mass of approximately 1.3–1.4 kg, just over 2% of body weight, the human brain accounts for roughly 20% of the body’s basal metabolism. In terms of power, this corresponds to a continuous average of about 20 watts.
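As a sanity check, the arithmetic behind the 20-watt figure is straightforward. The sketch below assumes a total daily energy turnover of about 2,000 kcal (a hypothetical round number, not stated in the text) together with the ~20% brain share quoted above:

```python
# Rough estimate of the brain's average power draw.
# Assumption (hypothetical round number): ~2,000 kcal of daily energy turnover.
KCAL_PER_DAY = 2000
J_PER_KCAL = 4184          # joules per kilocalorie
SECONDS_PER_DAY = 86_400
BRAIN_SHARE = 0.20         # ~20% of metabolism, as cited above

body_watts = KCAL_PER_DAY * J_PER_KCAL / SECONDS_PER_DAY
brain_watts = body_watts * BRAIN_SHARE

print(f"whole body: {body_watts:.0f} W, brain: {brain_watts:.0f} W")
# prints: whole body: 97 W, brain: 19 W
```

With these assumptions the estimate lands right at the ~20 W the article cites.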
This is a surprisingly low value given the number of operations the brain carries out in parallel: motor control, sensory integration, autonomic regulation, learning, memory, and decision making. Comparison with modern artificial-intelligence systems highlights a fundamental difference: energy efficiency depends not only on what is computed, but on how information is represented and transformed.
How much the human brain really consumes
The brain’s energy consumption is surprisingly stable. Whether we are lying on the sofa, concentrating on a math problem, or immersed in a conversation, the power drawn remains around 20 watts.
This happens because the brain does not “switch on” new parts when we think harder: it is always active, and cognitive work corresponds above all to a reorganization of neuronal activity patterns, not to a drastic increase in consumption. From the standpoint of brain energy consumption, the most expensive processes are:
- Maintenance of ionic gradients (Na⁺, K⁺, Ca²⁺) across neuronal membranes.
- Synaptic transmission, particularly the release and recycling of neurotransmitters.
- Propagation of action potentials along axons.
The main fuel is glucose, and experimental estimates indicate that most of the energy is spent keeping the system ready to respond, i.e. in a dynamic state close to functional equilibrium. Keeping a network ready to respond at all times has a baseline cost that cannot be eliminated, but evolution has made it extremely efficient.
A crucial aspect is that:
- the brain is event-driven: neurons consume energy mainly when they transmit signals;
- activity is sparse and asynchronous: not all neurons are active at the same time;
- processing is massively parallel.
This explains why total consumption remains almost constant between rest and complex cognitive tasks.
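The impact of sparse, event-driven activity on operation count can be illustrated with a toy calculation (all numbers below are illustrative assumptions, not measurements): in a dense artificial layer every connection does work on every pass, whereas in an event-driven scheme only connections leaving currently active neurons do.

```python
# Toy comparison: dense vs. event-driven operation counts for one "layer".
# All figures are illustrative assumptions.
n_in, n_out = 10_000, 10_000    # neurons in and out
active_fraction = 0.02          # assume ~2% of input neurons fire in a given window

dense_ops = n_in * n_out                          # every connection computes
event_ops = int(active_fraction * n_in) * n_out   # only spiking inputs propagate

print(f"dense: {dense_ops:,} ops, event-driven: {event_ops:,} ops "
      f"({dense_ops // event_ops}x fewer)")
# prints: dense: 100,000,000 ops, event-driven: 2,000,000 ops (50x fewer)
```

The exact ratio depends entirely on the assumed activity level, but the principle holds: silence is nearly free in an event-driven system.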
How much artificial intelligence consumes
If we shift attention to artificial intelligence, the picture changes radically. Systems like ChatGPT run on digital computers and are based on deep artificial neural networks. From a computational point of view, these models work through a long sequence of explicit mathematical operations: multiplications and additions between numbers, organized in layers. Each time the model generates a response, these operations must be performed again.
Every single operation has a well-defined energy cost, because it requires physically moving electric charge through transistors. Unlike the brain, which exploits physical dynamics already present in the neural network, artificial intelligence must recompute everything step by step. The consumption of a single response can be relatively low, but the key point is that this cost is paid again for every request; it never vanishes.
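To get a feel for the orders of magnitude, one can make a back-of-the-envelope estimate of the energy for a single generated response. Every parameter below (model size, tokens per response, energy per operation) is a hypothetical placeholder, not a measured value for any real system:

```python
# Back-of-the-envelope energy for one generated response.
# Every number here is a hypothetical assumption for illustration.
params = 7e9                   # model parameters
tokens = 500                   # tokens generated in one response
flops_per_token = 2 * params   # ~2 FLOPs per parameter per token (common rule of thumb)
joules_per_flop = 1e-11        # ~10 pJ/FLOP, an assumed accelerator-scale figure

energy_j = flops_per_token * tokens * joules_per_flop
print(f"{energy_j:.0f} J per response")
# prints: 70 J per response
```

Tens of joules per response sounds small, but unlike the brain's baseline, this cost is incurred anew for every single request served.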
The hidden weight of infrastructure
The true energy consumption of AI emerges when you look beyond the single response. Large models are trained on clusters of thousands of processors for weeks or months, with power in the megawatt range. The total energy expended in this phase can reach hundreds of thousands or millions of kilowatt-hours. Added to this is the permanent cost of data centers: servers that are always on, cooling systems, redundancy, and reliability. Even when no one is querying the model, the infrastructure remains operational. It is a continuous consumption that does not directly depend on instantaneous use.
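The same kind of rough arithmetic shows how training reaches the millions-of-kilowatt-hours scale. The cluster power and duration below are hypothetical round numbers chosen only to illustrate the conversion:

```python
# Rough training-energy estimate; power and duration are hypothetical.
cluster_power_mw = 10    # megawatts drawn by the training cluster (assumed)
training_weeks = 4       # duration of the training run (assumed)

hours = training_weeks * 7 * 24
energy_kwh = cluster_power_mw * 1000 * hours   # MW -> kW, times hours

print(f"{energy_kwh:,.0f} kWh")
# prints: 6,720,000 kWh
```

Even a modest megawatt-scale cluster running for weeks lands in the millions of kilowatt-hours, consistent with the range cited above.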
Architecture: where the difference really comes from
The fundamental difference between brain and artificial intelligence is not only quantitative, but structural. In digital computers, memory and computation are separate. Data must be continuously transferred from memory to computing units and vice versa. This movement of information in space is one of the most expensive operations from an energy point of view. In the human brain, however, memory and calculation coincide.
Information is embedded in synapses and network organization. Learning means changing connections locally, not moving data from a central repository. From a physical perspective, this dramatically reduces the energy cost of computing.
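The cost of moving data versus computing on it can be made concrete with ballpark circuit-level figures commonly cited in the computer-architecture literature (treat them as order-of-magnitude values, not measurements from this text): a 32-bit floating-point add costs on the order of 1 pJ, while fetching 32 bits from off-chip DRAM costs on the order of hundreds of pJ.

```python
# Order-of-magnitude comparison: arithmetic vs. off-chip memory access.
# Values are commonly cited ballpark figures, used here as assumptions.
pj_fp32_add = 1.0     # ~1 pJ for a 32-bit floating-point add
pj_dram_32b = 640.0   # ~640 pJ to fetch 32 bits from off-chip DRAM

print(f"one DRAM fetch costs about {pj_dram_32b / pj_fp32_add:.0f}x one add")
# prints: one DRAM fetch costs about 640x one add
```

This two-to-three-orders-of-magnitude gap is exactly why collapsing memory and computation into the same substrate, as synapses do, is such a large energetic win.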
This difference becomes even more evident if we consider learning: artificial intelligence learns above all during a dedicated phase, training, which is separate from use and extremely energy-intensive. Once training is complete, the model is used, but the actual learning has already occurred.
The human brain, in contrast, continually learns as it works. There is no phase in which it stops operating in order to train.
If we make a very rough estimate and assume an average consumption of 20 watts over the first 20 years of life, we obtain a total energy on the order of 3,500 kilowatt-hours, a value comparable to the electricity consumption of an apartment over a few months.
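The 3,500 kWh figure follows directly from the two numbers given in the text:

```python
# 20 W sustained over 20 years, expressed in kilowatt-hours.
watts = 20
years = 20
hours = years * 365.25 * 24

energy_kwh = watts * hours / 1000
print(f"{energy_kwh:,.0f} kWh")
# prints: 3,506 kWh
```

Twenty years of continuous brain operation costs roughly what a single large training run spends in minutes.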
At the basis of all this there is a difference in history. The human brain is the result of millions of years of evolution, during which energy efficiency has been a fundamental selective pressure. Space and energy were limited resources.
Artificial intelligence, on the other hand, is the product of a few decades of technological development, in which energy has often been treated as a scalable resource: if more power is needed, new servers are added.
The comparison between the human brain and artificial intelligence does not tell us that one is “smarter” than the other. It shows us something deeper: intelligence is always embodied in a physical structure, and its energetic cost depends on how that structure is made.
Artificial intelligence is a very powerful tool, but it pays for its power with high energy consumption. The human brain, by contrast, represents an extraordinary compromise between computational capacity, robustness, and efficiency.