Comparing the energy of the human brain and artificial intelligence

It is often remarked, sometimes with unease, that artificial intelligence consumes enormous amounts of energy while the human brain runs on little more than the power of a small light bulb. The contrast is striking. But before we marvel too quickly at the efficiency of the brain, we should pause and ask whether we are making a fair comparison.
What exactly is the human equivalent of an AI system answering a question?
The simplest comparison would be a single answer. You ask a question and either a person or an AI replies. But that framing quietly hides a large part of the machine’s cost. An AI answer depends on a vast training process that took place earlier in large data centres. The electricity used to train the model does not appear in the moment of answering. If we compare a human reply with a single AI response, we are ignoring the energy required to build the machine’s knowledge in the first place.
Another possibility is to compare a trained AI model with a trained human expert. A PhD, for example, represents decades of learning. The AI equivalent is its training phase, during which the model absorbs enormous amounts of text and data. Both systems require a long investment before they are able to produce sophisticated answers.
We could widen the frame further. AI models are trained on the accumulated output of millions of people: books, research papers, code, and conversations. In that sense an AI model resembles a compressed form of collective knowledge. The human comparison might not be a single expert at all, but something closer to a research community.
There is an even deeper perspective. Human intelligence itself is the product of hundreds of millions of years of evolution. If we tried to account for the energy required to evolve brains capable of language and reasoning, biological intelligence would hardly look inexpensive.
For practical purposes, however, the clearest comparison is between a trained AI model and a trained human expert.
Once we make that comparison, the numbers become concrete. Training a frontier AI model today can require several million kilowatt-hours of electricity. That cost is paid up front during training, after which the model can generate answers at relatively low additional cost.
The human brain, by contrast, runs on about twenty watts of power. Over a full day that amounts to roughly half a kilowatt-hour. Within that modest energy budget the brain performs perception, memory, learning, language, and reasoning.
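To make the comparison concrete, here is a rough back-of-envelope sketch in Python. The specific figures are assumptions chosen for illustration, not measurements: a training run of 5 million kilowatt-hours (within the "several million" range above) and roughly thirty years of living to reach expert level.

```python
# Back-of-envelope energy comparison: a frontier training run versus a
# human brain accumulating expertise. All figures are illustrative
# assumptions, not measurements.

BRAIN_POWER_W = 20            # assumed continuous power draw of the brain
YEARS_TO_EXPERTISE = 30       # assumed time to reach expert level
TRAINING_RUN_KWH = 5_000_000  # assumed frontier run ("several million kWh")

kwh_per_day = BRAIN_POWER_W * 24 / 1000             # ~0.48 kWh per day
brain_kwh = kwh_per_day * 365 * YEARS_TO_EXPERTISE  # ~5,256 kWh in total

print(f"brain, per day:        {kwh_per_day:.2f} kWh")
print(f"brain, over {YEARS_TO_EXPERTISE} years:   {brain_kwh:,.0f} kWh")
print(f"assumed training run:  {TRAINING_RUN_KWH:,} kWh")
print(f"ratio (model / brain): {TRAINING_RUN_KWH / brain_kwh:,.0f}x")
```

Under these assumptions, the single training run costs roughly a thousand times the brain's entire thirty-year energy budget, which is why the up-front cost belongs in the comparison.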
The real puzzle is not that AI systems use a great deal of energy. Modern computers were built for speed and scale, not for metabolic thrift. The deeper puzzle is why the brain is so efficient.
Evolution had a strict energy budget. Brains that wasted energy did not survive. Neurons fire sparsely, meaning most of the brain is quiet most of the time. Memory and computation happen in the same place, reducing the need to move information around. And the brain relies heavily on prediction, focusing effort on what changes rather than recalculating everything continuously.
The result is a form of intelligence that runs steadily on the power of a small light bulb.
Perhaps the real surprise is not that artificial intelligence consumes so much energy, but that human intelligence runs on just twenty watts.