AI Chatbots’ Thirst May Kill Our Water Resources

OpenAI CEO Sam Altman has disclosed for the first time the average energy and water consumption of a single ChatGPT query, figures far lower than analysts and environmentalists had long projected.

“The average query uses about 0.34 watt-hours,” Altman said in a blog post titled The Gentle Singularity. “It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon.”

For comparison, 0.34 watt-hours is roughly the energy a high-efficiency LED lightbulb uses in a couple of minutes.
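The two figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below assumes a 10-watt rating for the "high-efficiency lightbulb" (that wattage is an assumption, not stated in the post) and uses the standard conversion of 768 US teaspoons per US gallon:

```python
# Sanity-check the unit conversions behind Altman's per-query figures.
GAL_TO_TSP = 768        # 1 US gallon = 768 US teaspoons
WATER_GAL = 0.000085    # gallons of water per query (Altman)
ENERGY_WH = 0.34        # watt-hours per query (Altman)
LED_WATTS = 10          # ASSUMED rating for a high-efficiency LED bulb

teaspoons = WATER_GAL * GAL_TO_TSP        # ~0.065 tsp per query
fraction = 1 / teaspoons                  # ~15.3, i.e. about one-fifteenth
led_minutes = ENERGY_WH / LED_WATTS * 60  # ~2.0 minutes of bulb runtime

print(f"{teaspoons:.3f} tsp (1/{fraction:.0f}), {led_minutes:.1f} min")
```

Both of Altman's comparisons check out: 0.000085 gallons is about one-fifteenth of a teaspoon, and 0.34 Wh runs a 10 W bulb for roughly two minutes.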

According to a study on ChatGPT’s water consumption by A. Shaji George, an expert in Information and Communications Technology (ICT), the chatbot consumes about 0.5 litres of water over the course of a lengthy conversation with a user, a pattern the study suggests extends to other AI systems and large language models. Half a litre is roughly the amount of water needed to cook two packets of Maggi instant noodles.

According to data provided by Semrush, ChatGPT was ranked as the eighth most visited site in the world as of March 2025, with approximately 5.56 billion visits. 

Meanwhile, earlier research estimated that each ChatGPT query uses around 2.9 watt-hours of electricity, roughly ten times as much as a simple Google search. Altman’s figure is nearly one-tenth of that estimate.

Altman previously shared that users’ polite courtesies with ChatGPT, such as “please” and “thank you,” have cost the company tens of millions of dollars in electricity expenses.

This revelation comes amid increasing scrutiny of AI’s resource usage as models grow in size and adoption. 

Altman also laid out a longer-term forecast on AI cost trends, stating, “As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity.”

Environmental analysts note that while per-query usage is low, aggregate consumption remains a concern. With hundreds of millions of queries each day, energy and water use from AI data centres could grow substantially. 

In the same post, Altman acknowledged these broader implications, saying, “The economic value creation has started a flywheel of compounding infrastructure buildout to run these increasingly-powerful AI systems.”

The release of these statistics appears to be part of OpenAI’s broader attempt to improve transparency about its AI infrastructure. As public and regulatory interest in AI’s environmental impact grows, such disclosures are likely to play a larger role in shaping both company policy and public perception.

The post Each ChatGPT Query Uses Merely a 15th of a Teaspoon of Water, Says Sam Altman appeared first on Analytics India Magazine.