
Here’s What Your ChatGPT Queries Are Costing the Environment
Diverging Reports Breakdown
Sam Altman reveals water cost of each ChatGPT query; it will surprise you
OpenAI CEO Sam Altman said a single ChatGPT query uses a few drops of water, a claim that comes as the environmental cost of artificial intelligence faces growing scrutiny. Critics point out that OpenAI has not explained how this number was calculated. Altman believes the cost of intelligence will one day drop to the price of electricity alone, which could make AI both affordable and sustainable. But for now, experts say, even a few drops of water per query raise big questions about AI’s long-term environmental cost. The number sounds reassuring, but without knowing how the math was done, it is hard to trust fully.
OpenAI CEO Sam Altman shared that a single ChatGPT query uses a few drops of water. This comes at a time when the environmental cost of artificial intelligence is under growing scrutiny.
In a blog post, Altman said each query consumes about 0.000085 gallons of water. That’s roughly one-fifteenth of a teaspoon. AI models like ChatGPT run on massive server farms that must be cooled constantly. This makes water usage an important part of the conversation. Altman’s claim aims to ease public concern, but some experts want more clarity and proof.
How water usage is connected to ChatGPT
AI runs on powerful computers stored in data centers that produce a lot of heat.
To keep them from overheating, companies use cooling systems that often depend on water. As tech becomes more central to daily life, water use has joined energy and carbon emissions in the sustainability debate.
Sam Altman’s water estimate and what it means
Altman said each ChatGPT query takes about 0.34 watt-hours of electricity and a few drops of water. That may sound small, but when you think about the millions of queries made each day, the total adds up. Critics point out that OpenAI has not explained how this number was calculated.
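Altman’s per-query figures are tiny in isolation; a quick back-of-envelope calculation shows how they scale with volume. The one-billion-queries-per-day figure below is an assumption for illustration, not a number from the article:

```python
# Scaling Altman's per-query figures to an assumed daily query volume.
WH_PER_QUERY = 0.34               # watt-hours per query (Altman's figure)
GAL_PER_QUERY = 0.000085          # gallons of water per query (Altman's figure)
QUERIES_PER_DAY = 1_000_000_000   # assumed volume, for illustration only

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
daily_gallons = GAL_PER_QUERY * QUERIES_PER_DAY

print(f"{daily_mwh:.0f} MWh/day")       # 340 MWh/day
print(f"{daily_gallons:,.0f} gal/day")  # 85,000 gal/day
```

Under that assumed volume, the “few drops” add up to hundreds of megawatt-hours and tens of thousands of gallons per day, which is why critics want to see the methodology behind the per-query number.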
That lack of detail has made some experts cautious.
Past concerns about AI’s water use
A report from The Washington Post last year estimated that creating a 100-word email with GPT-4 could use more than a full bottle of water. These numbers were tied to the cooling needs of data centers, especially those in hot and dry places. Altman’s latest statement appears to push back on that report as pressure grows on tech firms to be more accountable.
Experts call for transparency
Many in the tech and environmental space say companies like OpenAI need to publish independent and verified data about their resource use.
Altman’s number sounds reassuring but without knowing how the math was done or where the servers are located, it is hard to trust fully.
Can AI be sustainable?
As AI becomes a part of more industries and daily life, its long-term environmental cost matters more than ever. Altman believes the cost of intelligence will one day drop to the price of electricity alone. That could make AI both affordable and sustainable. But for now, even a few drops of water per query raise big questions.
OpenAI CEO Sam Altman Discloses ChatGPT’s Energy and Water Usage Per Query
OpenAI CEO Sam Altman has revealed that a single ChatGPT query consumes around 0.34 watt-hours of electricity and 0.000085 gallons of water. This marks the most detailed public disclosure yet on the environmental footprint of OpenAI’s AI systems. Altman likened the energy usage to “an oven running for just over one second” or a high-efficiency lightbulb lit for a couple of minutes. The newly revealed energy and water usage figures are notably lower than previous estimates. A 2023 Washington Post report claimed that generating a 100-word email using GPT-4 could consume the equivalent of a bottle of water, though this varied by data center.
OpenAI CEO Sam Altman has revealed that a single ChatGPT query consumes around 0.34 watt-hours of electricity and 0.000085 gallons of water—roughly one-fifteenth of a teaspoon. In his blog post The Gentle Singularity, Altman likened the energy usage to “an oven running for just over one second” or a high-efficiency lightbulb lit for a couple of minutes. This marks the most detailed public disclosure yet on the environmental footprint of OpenAI’s AI systems.
also, here is one part that people not interested in the rest of the post might still be interested in: pic.twitter.com/ANDhHu9g3g — Sam Altman (@sama) June 10, 2025
Amid rising concerns about the energy and water consumption of AI systems, Sam Altman has added a revealing perspective to the discussion. “People are often curious about how much energy a ChatGPT query uses,” he noted, explaining that the average prompt consumes about 0.34 watt-hours.
The newly revealed energy and water usage figures for ChatGPT are notably lower than previous estimates. A 2023 Washington Post report claimed that generating a 100-word email using GPT-4 could consume the equivalent of a bottle of water, though this varied by data center. Sam Altman addressed these concerns by highlighting the rapid growth in AI infrastructure, driven by increasing economic value and demand for powerful models.
wrote a new post, the gentle singularity.
realized it may be the last one like this i write with no AI help at all.
(proud to have written “From a relativistic perspective, the singularity happens bit by bit, and the merge happens slowly” the old-fashioned way) — Sam Altman (@sama) June 10, 2025
Altman also emphasized that as data centers become more automated, the cost of intelligence may eventually align closely with the cost of electricity. OpenAI’s disclosure appears aimed at increasing transparency amid rising scrutiny from policymakers and researchers over AI’s environmental impact. Earlier forecasts even suggested AI could consume more power than Bitcoin mining by the end of 2025, making such disclosures increasingly important in the global energy conversation.
Sam Altman previously revealed that users adding polite phrases like “please” and “thank you” in ChatGPT queries has cumulatively cost OpenAI tens of millions of dollars in electricity expenses. As concerns about AI’s environmental impact grow, such disclosures are expected to shape public opinion and could play a role in shaping future regulatory frameworks around AI efficiency and sustainability.
ChatGPT isn’t great for the planet. Here’s how to use AI responsibly.
The carbon cost of asking an artificial intelligence model a single text question can be measured in grams of CO2. For simple questions, such as finding a store’s hours or looking up a basic fact, you’re better off using a search engine or going directly to a trusted website: a Google search takes about 10 times less energy than a ChatGPT query, according to a 2024 analysis from Goldman Sachs. When you do use AI, you can choose between bigger models that use more computing power to tackle complicated questions and smaller ones designed to give shorter, quicker answers using less power. For simpler tasks, such as reviewing a high school math assignment, a smaller, more energy-efficient model might get the job done, Gudrun Socher said. “The real question isn’t: Does [AI] have impact or not? Yes, it clearly does,” said Bill Tomlinson, a professor of informatics at the University of California at Irvine.
That doesn’t mean you have to shun the technology entirely, according to computer scientists who study AI’s energy consumption. But you can be thoughtful about when and how you use AI chatbots.
“Use AI when it makes sense to use it. Don’t use AI for everything,” said Gudrun Socher, a computer science professor at Munich University of Applied Sciences. For basic tasks, you may not need AI — and when you do use it, you can choose to use smaller, more energy-efficient models.
When should I use AI?
For simple questions — such as finding a store’s hours or looking up a basic fact — you’re better off using a search engine or going directly to a trusted website than asking an AI model, Socher said.
A Google search takes about 10 times less energy than a ChatGPT query, according to a 2024 analysis from Goldman Sachs — although that may change as Google makes AI responses a bigger part of search. For now, a determined user can avoid prompting Google’s default AI-generated summaries by switching over to the “web” search tab, which is one of the options alongside images and news. Adding “-ai” to the end of a search query also seems to work. Other search engines, including DuckDuckGo, give you the option to turn off AI summaries.
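The opt-out tricks above can be scripted. Here is a minimal sketch that combines the article’s “-ai” suffix with Google’s unofficial `udm=14` parameter, which is widely reported to select the plain “Web” results tab; the parameter is an assumption on my part, is not documented by Google, and either behavior could change at any time:

```python
# Build a Google search URL that tries to avoid AI-generated summaries.
from urllib.parse import urlencode

def web_only_search_url(query: str) -> str:
    # "-ai" is the in-query trick from the article; udm=14 is the
    # widely reported (unofficial) parameter for the "Web" tab.
    params = {"q": f"{query} -ai", "udm": "14"}
    return "https://www.google.com/search?" + urlencode(params)

url = web_only_search_url("hardware store hours")
print(url)
```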
If you have a thornier problem, especially one that involves summarizing, revising or translating text, then it’s worth using an AI chatbot, Socher said.
For some tasks, using AI might actually generate less CO2 than doing it yourself, according to Bill Tomlinson, a professor of informatics at the University of California at Irvine.
“The real question isn’t: Does [AI] have impact or not? Yes, it clearly does,” Tomlinson said. “The question is: What would you do instead? What are you replacing?”
An AI model can spit out a page of text or an image in seconds, while typing or digitally illustrating your own version might take an hour on your laptop. In that time, a laptop and a human worker will cause more CO2 pollution than an AI prompt, according to a 2024 paper that Tomlinson co-wrote.
Tomlinson acknowledged that there are many other reasons you might not choose to let AI write or illustrate something for you — including worries about accuracy, quality, plagiarism and so on — but he argued that it could lower emissions if you use it to save labor and laptop time.
Which AI model should I use?
Not all AI models are equal: You can choose between bigger models that use more computing power to tackle complicated questions or small ones designed to give shorter, quicker answers using less power.
ChatGPT, for instance, allows paying users to toggle between its default GPT-4o model, the bigger and more powerful GPT-4.5 model, and the smaller o4-mini model. Socher said the mini is good enough for most situations.
But there’s something of a trade-off between size, energy use and accuracy, according to Socher, who tested the performance of 14 AI language models from Meta, Alibaba, DeepSeek and a Silicon Valley start-up called Deep Cogito in a paper published Thursday. (Socher and her co-author, Maximilian Dauner, couldn’t test popular models such as OpenAI’s ChatGPT or Google’s Gemini because those companies don’t share their code publicly.)
Socher and Dauner asked the AI models 500 multiple-choice and 500 free-response questions on high school math, world history, international law, philosophy and abstract algebra. Bigger models gave more accurate answers but used several times more energy than smaller models.
If you have a request for an AI chatbot that involves grappling with complicated or theoretical concepts — such as philosophy or abstract algebra — it’s worth the energy cost to use a bigger model, Socher said. But for simpler tasks, such as reviewing a high school math assignment, a smaller model might get the job done with less energy.
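One way to reason about that trade-off is energy per correct answer rather than energy per query. The numbers below are hypothetical, chosen only to illustrate the metric; they are not figures from Socher and Dauner’s paper:

```python
# Energy per correct answer: a big model can be more accurate per query
# yet cost far more energy for each answer it actually gets right.
def energy_per_correct(wh_per_query: float, accuracy: float,
                       n_questions: int = 1000) -> float:
    correct = accuracy * n_questions
    return wh_per_query * n_questions / correct

big = energy_per_correct(wh_per_query=2.0, accuracy=0.90)    # hypothetical big model
small = energy_per_correct(wh_per_query=0.3, accuracy=0.80)  # hypothetical small model
print(big, small)  # big ~2.22 Wh per correct answer, small ~0.38 Wh
```

On these made-up figures the small model spends far less energy per correct answer, which is the article’s point: if the smaller model is accurate enough for the task, the bigger one isn’t worth the cost.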
No matter what model you use, you can save energy by asking the AI to be concise when you don’t need long answers — and keeping your own questions short and to the point. Models use more energy for every extra word they process.
“People often mistake these things as having some sort of sentience,” said Vijay Gadepally, a senior scientist at the MIT Lincoln Laboratory who studies ways to make AI more sustainable. “You don’t need to say ‘please’ and ‘thank you.’ It’s okay. They don’t mind.”
What about ‘passive’ AI queries?
Using AI doesn’t just mean going to a chatbot and typing in a question. You’re also using AI every time an algorithm organizes your social media feed, recommends a song or filters your spam email.
“We may not even realize it … because a lot of this is just hidden from us,” Gadepally said.
Every ChatGPT Question Comes at a Cost: Enough Power for a Lightbulb and a Teaspoon of Water
OpenAI CEO Sam Altman has shed more light on the environmental footprint of AI tools like ChatGPT. In a recent blog post, he revealed that an average ChatGPT query uses just 0.000085 gallons of water, which he described as ‘roughly one-fifteenth of a teaspoon.’ The insight comes amidst growing concerns over how much energy and water artificial intelligence systems consume. Altman suggests that, in time, the expense of producing intelligence via AI will nearly match that of electricity, though this perspective also underscores the larger environmental concerns now under close examination. According to a 2024 report from The Washington Post, creating a 100-word email with GPT-4 might use roughly one bottle of water, showing how environmental effects can differ depending on infrastructure such as water-based cooling. Meanwhile, tech giants are turning to nuclear power: in September, Microsoft finalised a 20-year agreement with Constellation Energy to restart a dormant nuclear plant at Three Mile Island, and in October, Google revealed it had partnered with nuclear energy company Kairos Power.
Yet, behind each seemingly effortless query lies a hidden environmental toll.
From the energy powering vast data centres to the surprising amount of water used for cooling, the true cost of our AI interactions extends far beyond a simple internet connection.
OpenAI CEO Sam Altman has now shed more light on the environmental footprint of AI tools like ChatGPT.
In a recent blog post, he revealed that an average ChatGPT query uses just 0.000085 gallons of water, which he described as ‘roughly one-fifteenth of a teaspoon.’
The Environmental Cost of ChatGPT
This insight comes amidst growing concerns over how much energy and water artificial intelligence systems consume. Altman’s recent blog post, while discussing the future of AI, also presented specific figures regarding ChatGPT’s resource consumption.
‘People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes,’ Altman wrote.
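Altman’s household comparisons check out dimensionally. Here is a quick sanity check, assuming a roughly 1 kW oven element and a roughly 10 W high-efficiency bulb; both wattages are my assumptions, not figures from the post:

```python
# Sanity-checking the oven and lightbulb comparisons for 0.34 Wh.
QUERY_WH = 0.34
query_joules = QUERY_WH * 3600  # 1 Wh = 3600 J, so ~1224 J per query

oven_watts = 1000                         # assumed oven draw
oven_seconds = query_joules / oven_watts  # ~1.2 s: "a little over one second"

bulb_watts = 10                                # assumed LED bulb draw
bulb_minutes = query_joules / bulb_watts / 60  # ~2 minutes

print(round(oven_seconds, 2), round(bulb_minutes, 2))
```

With those assumed wattages, the query works out to about 1.2 seconds of oven time and about two minutes of bulb time, consistent with Altman’s framing.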
This was less than almost every estimate I have seen: according to the latest Sam Altman post, the average ChatGPT query uses about the same amount of power as the average Google search in 2009 (the last time they released a per-search number)… 0.0003 kWh pic.twitter.com/AgVQB7zkOu — Ethan Mollick (@emollick) June 10, 2025
‘It also uses about 0.000085 gallons of water, roughly one-fifteenth of a teaspoon,’ he added. The water usage of one-fifteenth of a teaspoon per query might appear tiny for one person. Yet, given the billions of queries AI systems handle daily, the total effect becomes substantial. OpenAI did not explain how these figures were arrived at.
‘As data centre production gets automated, the cost of intelligence should eventually converge to near the cost of electricity,’ Altman continued. However, this perspective on electricity cost also underscores the larger environmental concerns now under close examination.
Increased Focus on AI’s Environmental Burden
As AI becomes more widely adopted, experts and researchers are concerned about its environmental burden. Studies earlier this year forecast that AI could potentially use more electricity than Bitcoin mining by late 2025.
Water consumption also presents a major worry, particularly for water-based cooling data centres. According to a 2024 report from The Washington Post, creating a 100-word email with GPT-4 might use roughly one bottle of water, influenced by the data centre’s site and cooling approach. This demonstrates how environmental effects can differ depending on the infrastructure.
Altman suggests that, in time, the expense of producing intelligence via AI will nearly match that of electricity. Until then, discussions concerning AI’s environmental impact will become more vocal.
Every Drop Counts: AI’s Thirst for Water
This isn’t the only occasion Altman has foreseen AI becoming more affordable to run. In a February blog post, Altman stated that AI’s operating costs would fall tenfold every year.
‘You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period,’ Altman wrote. ‘Moore’s law changed the world at 2x every 18 months; this is unbelievably stronger,’ he added.
Big Tech’s Ambitious Energy Drive
Leading tech firms in the AI race are exploring nuclear energy as a power source for their data centres. In September, Microsoft finalised a 20-year agreement with Constellation Energy to restart a dormant nuclear plant at Three Mile Island.
Amazing story on Nuclear Power – from the Three Mile Island accident in 1979 to Microsoft in 2025 paying to restart it pic.twitter.com/UmfhyVn7n9 — Chris Fralic (@chrisfralic) May 28, 2025
In October, Google revealed it had partnered with Kairos Power, a nuclear energy company, to produce three small modular nuclear reactors. These reactors, which can supply up to 500 megawatts of electricity, are expected to be operational by 2035.
Google, Amazon, and Microsoft have signed deals for nuclear energy projects to power their AI and data capabilities. Google’s agreement with Kairos to buy power from multiple small modular reactors is a world first with 500MW planned across 6-7 reactors. pic.twitter.com/xNxQpKKkdr — Works in Progress (@WorksInProgMag) November 13, 2024
In an October interview, Google’s CEO, Sundar Pichai, told Nikkei Asia that the search giant aims for net-zero emissions across its entire operations by 2030. He further stated that Google also assessed solar energy beyond just nuclear power.
‘It was a very ambitious target, and we are still going to be working very ambitiously towards it. Obviously, the trajectory of AI investments has added to the scale of the task needed,’ Pichai said.
Hey Chat, how much do you cost the environment when you answer my questions?
The United Arab Emirates became the world’s first country to offer free access to ChatGPT Plus. The premium version is faster and more consistent than the standard version; it can also hold voice conversations, upload and analyze your files, and generate images. About 34% of Americans rely on AI to help them accomplish some of their day-to-day activities, per polling from tech monitor Elf Sight. But there are concerns about the vast amounts of natural resources sucked up by AI, depleting reservoirs and requiring additional energy. Thousands of acres near Abilene, Texas, have been earmarked for “Stargate,” which is being co-developed by tech giants OpenAI, Oracle and SoftBank; the Trump administration has poured hundreds of millions of dollars of federal funding into the project, which will develop AI infrastructure for the U.S. and the UAE. The OpenAI for Countries program may be fit for those countries, but researchers say it may not be fit for the environment.
KEY POINTS The United Arab Emirates recently gifted ChatGPT Plus to all of its citizens, free of charge.
ChatGPT’s parent company, OpenAI, is building artificial intelligence infrastructure throughout the United States and the UAE. It also says it is fielding requests from other countries to do the same for them.
Energy advocates are sounding the alarm: AI already taxes natural resources at a precipitous rate, and wider deployment could compound the environmental consequences.
Earlier this week, the United Arab Emirates became the world’s first country to offer free access to ChatGPT Plus — the premium version of ChatGPT — to all its citizens. The premium version is faster and more consistent than the normal version; it also can hold voice conversations, upload and analyze your files, and generate its own images for your use.
This is just the beginning for OpenAI, ChatGPT’s parent company. OpenAI has announced intentions to partner with as many nations as possible through its “OpenAI for Countries program.”
OpenAI CEO Sam Altman has already described the UAE project as a “bold vision,” per Axios; wrapping artificial intelligence around the world would constitute an even bolder, more radical vision for a global population increasingly dependent on AI.
But can the Earth take it?
But there are concerns about the vast amounts of natural resources sucked up by AI, depleting reservoirs and requiring additional energy.
Meanwhile, politicians, business leaders and climate advocates continue to grapple over the consequences.
Traffic on Interstate 35 passes a Microsoft data center, Tuesday, Sept. 5, 2023, in West Des Moines, Iowa. Microsoft has been amassing a cluster of data centers to power its cloud computing services for more than a decade. | Charlie Neibergall, Associated Press
Texas leads the way with AI development
About 34% of Americans rely on AI to help them accomplish some of their day-to-day activities, per polling from tech monitor Elf Sight. That’s evidence of the early adoption of AI — especially because ChatGPT, which marked the beginning of the widespread AI craze, only launched in 2022.
OpenAI CEO Sam Altman became a billionaire in the following years. He was also a large donor to U.S. President Donald Trump’s 2024 presidential campaign and attended his inauguration.
The day after the inauguration, he made a public statement thanking the president for investing $500 billion into “Stargate,” which will develop AI infrastructure for the U.S.
“For (AI) to get built here, to create hundreds of thousands of jobs, to create a new industry centered here, we wouldn’t be able to do this without you, Mr. President, and I’m thrilled that we get to,” Altman said, per ABC News.
Since Inauguration Day, the Trump administration has poured hundreds of millions of dollars of federal funding into Stargate, which is being co-developed by tech giants OpenAI, Oracle and SoftBank. Thousands of acres near Abilene, Texas, have been earmarked for development, according to The Dallas Express.
There is no word yet on how Stargate might affect the state’s energy grid — which failed during natural disasters last year, leaving thousands of Texans in temporary darkness — or how it might affect the environment of a state that is already 41% in drought.
Nevertheless, many Texans and national leaders eagerly anticipate economic expansion. And they and the UAE (which is getting its own Stargate through its deal with OpenAI) aren’t alone in the rush to AI.
OpenAI says that, after its “unprecedented investment” in American infrastructure, they have “heard from many countries” petitioning them to integrate AI into their countries, too — meaning personalized digital servants tailored for regional dialects, government structures and social needs and customs.
The OpenAI for Countries program is fit for them. But researchers say it may not be fit for the environment.
The OpenAI logo is displayed on a cellphone in front of an image generated by ChatGPT’s Dall-E text-to-image model, Dec. 8, 2023, in Boston. | Michael Dwyer, Associated Press
What happens when you hit ‘send’ on ChatGPT
“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers (for AI) are present in our physical world … they have direct and indirect implications for biodiversity,” said Noman Bashir, a climate researcher at MIT.
Generative AI drinks a bottle of water for every 100-word email it writes. The electricity required by the massive machines powering programs like ChatGPT, Siri and Alexa is approaching levels equal to that of large countries like Russia, per research from MIT. ChatGPT alone uses enough electricity each day to power the Empire State Building for a year and a half. Tremendous amounts of fossil fuels, including diesel and crude oil, go into training generative AI.
And energy needs are only multiplying. The Harvard Business Review reports that data centers, or the physical facilities that hold information and communications systems (like the 900-acre facility planned for Stargate in Texas), are responsible for 2%-3% of global greenhouse gas emissions. The volume of data across the world doubles in size every two years.
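That doubling rate compounds quickly; a one-line sketch of the implied growth:

```python
# Growth implied by "data volume doubles every two years".
def growth_factor(years: float, doubling_period_years: float = 2) -> float:
    return 2 ** (years / doubling_period_years)

print(growth_factor(10))  # 32.0 -- a 32x increase over a decade
```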
“There is still much we don’t know about the environmental impact of AI but some of the data we do have is concerning,” said Golestan Radwan, who heads a United Nations environment agency. “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.”
Radwan’s agency recommends that countries begin tracking AI’s environmental impact. At the moment, most countries have few, if any, standards for AI environmental output. They also encourage countries to establish sustainability regulations around AI.
Finally, they urge tech companies to streamline their programs and begin recycling components and water.
Canny AI researchers are already at work to develop “green” AI — also known as sustainable or “net zero” AI — that could minimize the carbon footprints left by generative AI as it sprints across the globe.
But researchers also warn that green AI comes at the price of efficiency. The smarter the AI, the more energy it uses.
Earlier in May, a Republican-led tax bill proposed barring states from regulating AI for the next 10 years.
Last year, state legislatures across the country passed over 100 regulations surrounding AI; the tax bill would prevent state lawmakers from enforcing these regulations.
“We believe that excessive regulation of the AI sector could kill a transformative industry just as it’s taking off,” Vice President JD Vance told AI developers and regulators at a summit in Paris. “And I’d like to see that deregulatory flavor making a lot of the conversations this conference.”
Making AI greener: What can you do?
Researchers at the Harvard Business Review recommend ways an individual can reduce their AI-created environmental impact.
Use existing AI — don’t make your own program. Creating and training AI programs requires vast amounts of energy, and there is already a myriad of AI programs available, many for free and many tailored to the needs of particular businesses or regions.
Use AI only when you really need it. Machine learning models are excellent at helping scientists predict natural disasters and understand diseases; they are less valuable for providing answers, especially when those answers are often hallucinated. Writing emails and asking questions of ChatGPT “may be depleting the Earth’s health more than … helping its people,” say Harvard researchers.