Sam Altman clarifies ChatGPT's water usage: figures, debate, and questions surrounding AI's environmental impact

Last update: 12/06/2025

  • OpenAI CEO Sam Altman claims that each ChatGPT query uses about 0.00032 liters of water, comparing this volume to “one-fifteenth of a teaspoon.”
  • The energy consumption of an interaction with ChatGPT is around 0.34 watt-hours, similar to using an LED light bulb for a few minutes.
  • Experts and members of the scientific community point out that no clear evidence has been presented to support these figures, nor has their methodology been detailed.
  • The debate about the environmental impact of AI remains ongoing, especially regarding data center cooling and training large models.

The rapid advance of artificial intelligence has brought concerns about its environmental impact to the table, with special attention to the energy and water used to run popular models like ChatGPT, developed by OpenAI. In recent months, the company's CEO, Sam Altman, has sought to shed light on the true extent of his technology's consumption of natural resources, though not without controversy and unanswered questions.

Altman's statements on his personal blog have sparked intense debate in the technological and scientific sphere. As ChatGPT's popularity continues to grow globally, public opinion and the media have focused on the ecological footprint of each query and on whether the data provided truly reflect the environmental impact that artificial intelligence can have on everyday life.

How much water does ChatGPT actually use per query?

Recently, Sam Altman stated that every time a user interacts with ChatGPT, the associated water usage is minimal. As he explained, a single query consumes around 0.00032 liters of water, roughly equivalent to "one-fifteenth of a teaspoon." This amount is primarily used in the cooling systems of the data centers where servers process and generate AI responses.
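
As a quick sanity check, the teaspoon comparison in the paragraph above is easy to reproduce. The sketch below uses the standard US teaspoon volume of about 4.93 mL, which is a reference value assumed here rather than a figure published by OpenAI:

```python
# Check Altman's "one-fifteenth of a teaspoon" comparison.
WATER_PER_QUERY_L = 0.00032   # liters per query, as stated by Altman
TEASPOON_L = 0.00493          # one US teaspoon is about 4.93 mL (assumed reference volume)

fraction = WATER_PER_QUERY_L / TEASPOON_L
print(f"{fraction:.3f} teaspoons, i.e. about 1/{1 / fraction:.0f} of a teaspoon")
# Prints roughly 0.065 teaspoons, i.e. about 1/15 of a teaspoon
```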


Image: AI water consumption

Cooling is crucial to prevent the overheating of electronic components, especially in large infrastructures that run continuously and at full capacity. This need to cool machines with water isn't exclusive to ChatGPT; it is common to the entire cloud computing and AI sector. However, the sheer volume of daily queries (millions, according to OpenAI) means that even minuscule per-query consumption adds up to an appreciable impact.

Although Altman wanted to emphasize that the cost per user is almost negligible, experts have published higher figures in independent research. For example, recent analyses by American universities suggest that training large models like GPT-3 or GPT-4 can require hundreds of thousands of liters of water, although the specific use per daily query is much lower.

The controversy over the figures: doubts about transparency and methodology

Image: AI cooling systems and water use

Altman's statements have been received with caution by both the scientific community and specialized media, due to the lack of a detailed explanation of how these values were obtained. Several articles point out that OpenAI has not published the exact methodology for calculating water and energy consumption, which has led some media outlets and organizations to call for greater transparency in this area.


Media outlets such as The Washington Post and The Verge, and universities such as MIT and the University of California, have pointed to higher estimates, ranging from 0.5 liters for every 20-50 queries (in the case of earlier models such as GPT-3) to several hundred thousand liters for the AI training phase.
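
Taking those figures at face value, a rough comparison with Altman's number is straightforward. The sketch below uses only the values cited in this article and should be read as an order-of-magnitude illustration, not an authoritative estimate:

```python
# Compare the cited GPT-3 era estimate with Altman's per-query figure.
ALTMAN_L_PER_QUERY = 0.00032          # liters per query, per Altman
ESTIMATE_TOTAL_L = 0.5                # liters for a batch of queries, per the cited studies
QUERIES_LOW, QUERIES_HIGH = 20, 50    # batch size range cited above

per_query_low = ESTIMATE_TOTAL_L / QUERIES_HIGH    # 0.010 L per query
per_query_high = ESTIMATE_TOTAL_L / QUERIES_LOW    # 0.025 L per query

print(f"Independent estimate: {per_query_low:.3f}-{per_query_high:.3f} L per query")
print(f"Roughly {per_query_low / ALTMAN_L_PER_QUERY:.0f}x to "
      f"{per_query_high / ALTMAN_L_PER_QUERY:.0f}x Altman's figure")
# Roughly 31x to 78x higher than the 0.00032 L figure
```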

The energy debate: efficiency, context and comparisons

Another point addressed by Sam Altman is the energy consumption associated with each interaction with ChatGPT. According to his estimates, an average query involves about 0.34 watt-hours, similar to the energy consumed by an LED light bulb in two minutes or a household oven left on for one second. To better understand the impacts of AI, you can also consult the impact of artificial intelligence on sustainability.
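
The bulb and oven comparisons can be checked with simple arithmetic. The 10 W LED and 1.2 kW oven power ratings below are illustrative assumptions, not values given by OpenAI:

```python
# Back-of-the-envelope check of the 0.34 Wh comparisons.
QUERY_WH = 0.34      # watt-hours per query, per Altman

LED_BULB_W = 10      # assumed LED bulb power
OVEN_W = 1200        # assumed household oven power

minutes_of_led = QUERY_WH / LED_BULB_W * 60    # about 2 minutes
seconds_of_oven = QUERY_WH / OVEN_W * 3600     # about 1 second

print(f"~{minutes_of_led:.1f} minutes of a {LED_BULB_W} W LED bulb")
print(f"~{seconds_of_oven:.1f} seconds of a {OVEN_W} W oven")
```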

However, the efficiency of the models has increased in recent years, and today's hardware is capable of processing requests with less power than just a couple of years ago. This means that, although individual usage is low, the challenge lies in the enormous volume of simultaneous interactions that occur on platforms like ChatGPT, Gemini, or Claude.

Recent studies support a certain reduction in average consumption per query, although they insist that each browser, each device, and each region may show different figures, depending on the type of data center and the cooling system used.
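
One way to see why per-query water figures vary so much between facilities is to combine per-query energy with a data center's water usage effectiveness (WUE), expressed in liters of water per kWh of IT energy. The WUE values in the sketch below are illustrative assumptions, not figures reported by OpenAI:

```python
# Per-query water depends on per-query energy and the facility's WUE (L/kWh).
QUERY_KWH = 0.34 / 1000   # 0.34 Wh expressed in kWh

example_wue = {           # assumed WUE values for illustration
    "evaporative cooling, hot climate": 1.8,
    "typical facility": 1.0,
    "efficient free cooling": 0.2,
}

for site, wue in example_wue.items():
    liters = QUERY_KWH * wue
    print(f"{site}: ~{liters * 1000:.2f} mL per query")
```

At a WUE close to 1 L/kWh, the result lands near the 0.00032 liters per query that Altman cites, which hints at how such a figure could be derived, even though OpenAI has not confirmed its method.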

The cumulative footprint and the challenge of long-term sustainability

Image: ChatGPT energy and water efficiency

The real dilemma arises when extrapolating these tiny per-query numbers to the total number of daily interactions worldwide. The sum of millions of small drops can become a considerable amount of water, especially as AI is used for increasingly complex tasks and extends to sectors such as education, leisure, and healthcare.
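
To put that extrapolation into rough numbers, the sketch below assumes, purely for illustration, a volume of one billion queries per day; this is not an official OpenAI figure:

```python
# Extrapolate the per-query figures to an assumed global daily volume.
QUERIES_PER_DAY = 1_000_000_000   # assumed volume, for illustration only
WATER_PER_QUERY_L = 0.00032
ENERGY_PER_QUERY_WH = 0.34

daily_water_m3 = QUERIES_PER_DAY * WATER_PER_QUERY_L / 1000      # cubic meters per day
daily_energy_mwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1e6   # megawatt-hours per day

print(f"~{daily_water_m3:,.0f} m3 of water per day")   # ~320 m3/day
print(f"~{daily_energy_mwh:,.0f} MWh per day")          # ~340 MWh/day
```

Even under Altman's own per-query figures, the aggregate quickly stops being negligible at this kind of scale.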


Additionally, the training process for state-of-the-art AI models such as GPT-4 or GPT-5 continues to be extremely resource-intensive, both in terms of electricity and water, forcing technology companies to look for new energy sources, such as nuclear power, and to consider locations for their data centers where water infrastructure is guaranteed.

The lack of clear standards, official figures, and transparency in the calculations continues to fuel the controversy. Organizations like EpochAI and consulting firms have attempted to estimate the impact, but there is still no consensus on the true environmental cost of interacting with generative AI at scale. In the meantime, the debate opens a window for reflection on the future of the technology and the environmental responsibility of its key proponents.

The discussion around Sam Altman's figures, and around AI in general, highlights the tension between technological innovation and sustainability. While the numbers provided by Altman seek to reassure the public about the low impact of each individual query, the lack of transparency and the global scale of the service keep the spotlight on the need for monitoring and scientific rigor when assessing the ecological footprint of systems that are already part of our daily lives.

