Why saying “thank you” and “please” to artificial intelligence has a significant energy impact

How many times, when making a request to an artificial intelligence, do we add “please”? And how many more times do we thank large language models for their response? Our tendency to “humanize” AI comes at a price: saying “thank you” or “please” to ChatGPT, Gemini or other similar models has a significant energy impact, both in terms of electricity and water consumed – and, of course, money. But how can kind words have an energy impact? In this article we explain what happens.

How much using kind words with AI models consumes

Using “please” and “thank you” increases energy consumption for a simple reason: every single request to a chatbot costs money and energy, and every additional word adds to the computation the servers must perform. In other words, every “thank you” or “please” must be processed by the system, and each computation consumes electricity and costs money.

OpenAI's CEO, Sam Altman, himself confirmed this waste of energy when replying to an X user who wondered how much money OpenAI had lost in electricity costs because of all the people saying “please” and “thank you” to its artificial intelligence models.

To this question, Sam Altman replied “tens of millions of dollars”, adding the ironic remark “well spent, you never know”.

That said, pinning down AI's actual consumption is not easy, given that the companies involved publish little data. According to estimates from the International Energy Agency (IEA), an artificial intelligence model consumes approximately 0.3 Wh to generate a text response, which rises to 1.7 Wh to generate an image and 115 Wh to generate a short (6-second), low-quality video.

To put that in perspective, fully charging a smartphone requires roughly 15 Wh, rising to about 60 Wh if the device being charged is a laptop.
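These figures can be put side by side with a rough back-of-the-envelope calculation. The sketch below uses only the IEA estimates and the charging figures quoted above; they are approximate averages, not measurements of any specific model or device.

```python
# Back-of-the-envelope comparison using the IEA estimates quoted above.
# All figures are approximate averages, not measurements of a specific model.
ENERGY_WH = {
    "text response": 0.3,
    "image": 1.7,
    "6-second video": 115.0,
}
SMARTPHONE_CHARGE_WH = 15.0  # one full smartphone charge, per the article

for task, wh in ENERGY_WH.items():
    # How many of these AI tasks consume as much as one phone charge?
    equivalent = SMARTPHONE_CHARGE_WH / wh
    print(f"One phone charge (~15 Wh) ≈ {equivalent:.1f} × {task}")
```

By this arithmetic, one smartphone charge is worth roughly fifty text responses but only a fraction of a single short video, which is why video generation dominates the per-task estimates.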

And this covers only electricity; we must also add the water used to cool the servers: according to a study from the University of California, a 100-word email written by ChatGPT-4 can consume more than half a liter of water.

Should we stop being nice to AI?

At this point, one question remains: should we stop being kind to artificial intelligence models in order to waste less energy? Actually, no.

Logic would suggest that to avoid this waste of energy and money it would be enough to stop saying “please” and thanking the artificial intelligence. However, this hypothesis is refuted by a recent study, which showed that rudeness can actually worsen our interactions with AI.

This happens because, when faced with rude requests, the AI may generate imprecise and vague content, even going so far as to cut the conversation short for safety reasons.

Conversely, when we make the effort to express ourselves courteously and politely, we tend to be more accurate, phrasing our requests more precisely and in greater detail. This makes it easier for the AI to respond coherently and comprehensively and, consequently, less energy is consumed, because no follow-up clarifications need to be generated.

In short, continuing to address artificial intelligence politely – without overdoing the formalities – could reduce energy expenditure while helping the AI give us more relevant answers.