
Here’s what your ChatGPT queries are costing the environment
Diverging Reports Breakdown
Your A.I. Queries Come With a Climate Cost
Generative A.I. is quickly becoming part of daily life as tech giants race to develop the most advanced models. Some chatbots are linked to more greenhouse gas emissions than others. Researchers found that chatbots with bigger “brains” used exponentially more energy.
All those prompts come with an environmental cost: A report last year from the Energy Department found A.I. could help increase the portion of the nation’s electricity supply consumed by data centers from 4.4 percent to 12 percent by 2028. To meet this demand, some power plants are expected to burn more coal and natural gas.
And some chatbots are linked to more greenhouse gas emissions than others. A study published Thursday in the journal Frontiers in Communication analyzed different generative A.I. chatbots’ capabilities and the planet-warming emissions generated from running them. Researchers found that chatbots with bigger “brains” used exponentially more energy and also answered questions more accurately — up until a point.
“We don’t always need the biggest, most heavily trained model to answer simple questions. Smaller models are also capable of doing specific things well,” said Maximilian Dauner, a Ph.D. student at the Munich University of Applied Sciences and lead author of the paper. “The goal should be to pick the right model for the right task.”
ChatGPT isn’t great for the planet. Here’s how to use AI responsibly.
The carbon cost of asking an artificial intelligence model a single text question can be measured in grams of CO2. That doesn’t mean you have to shun the technology, but you can be thoughtful about how you use it: reach for a search engine when a simple lookup will do, match the model to the task, and save the biggest, most power-hungry models for the problems that genuinely need them.
That doesn’t mean you have to shun the technology entirely, according to computer scientists who study AI’s energy consumption. But you can be thoughtful about when and how you use AI chatbots.
“Use AI when it makes sense to use it. Don’t use AI for everything,” said Gudrun Socher, a computer science professor at Munich University of Applied Sciences. For basic tasks, you may not need AI — and when you do use it, you can choose to use smaller, more energy-efficient models.
When should I use AI?
For simple questions — such as finding a store’s hours or looking up a basic fact — you’re better off using a search engine or going directly to a trusted website than asking an AI model, Socher said.
A Google search takes about 10 times less energy than a ChatGPT query, according to a 2024 analysis from Goldman Sachs — although that may change as Google makes AI responses a bigger part of search. For now, a determined user can avoid prompting Google’s default AI-generated summaries by switching over to the “web” search tab, which is one of the options alongside images and news. Adding “-ai” to the end of a search query also seems to work. Other search engines, including DuckDuckGo, give you the option to turn off AI summaries.
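That tenfold gap can be made concrete with a back-of-envelope calculation. The per-query figures below are assumed round numbers chosen to match the roughly 10x factor cited above, not exact values from the Goldman Sachs analysis:

```python
# Back-of-envelope comparison of search vs. chatbot energy use.
# Per-query wattages are assumptions consistent with the ~10x factor
# cited from the 2024 Goldman Sachs analysis, not measured values.
SEARCH_WH = 0.3    # assumed Wh per traditional search query
CHATBOT_WH = 3.0   # assumed Wh per chatbot query (~10x a search)

def daily_savings_wh(queries_per_day: int) -> float:
    """Energy saved per day by using search instead of a chatbot
    for simple lookups."""
    return queries_per_day * (CHATBOT_WH - SEARCH_WH)

print(daily_savings_wh(20))  # 20 simple lookups a day -> 54.0 Wh saved
```

The absolute numbers are small for one person, which is why the aggregate effect across billions of daily queries, not any individual habit, is what drives data-center demand.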
If you have a thornier problem, especially one that involves summarizing, revising or translating text, then it’s worth using an AI chatbot, Socher said.
For some tasks, using AI might actually generate less CO2 than doing it yourself, according to Bill Tomlinson, a professor of informatics at the University of California at Irvine.
“The real question isn’t: Does [AI] have impact or not? Yes, it clearly does,” Tomlinson said. “The question is: What would you do instead? What are you replacing?”
An AI model can spit out a page of text or an image in seconds, while typing or digitally illustrating your own version might take an hour on your laptop. In that time, a laptop and a human worker will cause more CO2 pollution than an AI prompt, according to a 2024 paper that Tomlinson co-wrote.
Tomlinson acknowledged that there are many other reasons you might not choose to let AI write or illustrate something for you — including worries about accuracy, quality, plagiarism and so on — but he argued that it could lower emissions if you use it to save labor and laptop time.
Which AI model should I use?
Not all AI models are equal: You can choose between bigger models that use more computing power to tackle complicated questions or small ones designed to give shorter, quicker answers using less power.
ChatGPT, for instance, allows paying users to toggle between its default GPT-4o model, the bigger and more powerful GPT-4.5 model, and the smaller o4-mini model. Socher said the mini is good enough for most situations.
But there’s something of a trade-off between size, energy use and accuracy, according to Socher, who tested the performance of 14 AI language models from Meta, Alibaba, DeepSeek and a Silicon Valley start-up called Deep Cogito in a paper published Thursday. (Socher and her co-author, Maximilian Dauner, couldn’t test popular models such as OpenAI’s ChatGPT or Google’s Gemini because those companies don’t share their code publicly.)
Socher and Dauner asked the AI models 500 multiple-choice and 500 free-response questions on high school math, world history, international law, philosophy and abstract algebra. Bigger models gave more accurate answers but used several times more energy than smaller models.
If you have a request for an AI chatbot that involves grappling with complicated or theoretical concepts — such as philosophy or abstract algebra — it’s worth the energy cost to use a bigger model, Socher said. But for simpler tasks, such as reviewing a high school math assignment, a smaller model might get the job done with less energy.
No matter what model you use, you can save energy by asking the AI to be concise when you don’t need long answers — and keeping your own questions short and to the point. Models use more energy for every extra word they process.
“People often mistake these things as having some sort of sentience,” said Vijay Gadepally, a senior scientist at the MIT Lincoln Laboratory who studies ways to make AI more sustainable. “You don’t need to say ‘please’ and ‘thank you.’ It’s okay. They don’t mind.”
What about ‘passive’ AI queries?
Using AI doesn’t just mean going to a chatbot and typing in a question. You’re also using AI every time an algorithm organizes your social media feed, recommends a song or filters your spam email.
“We may not even realize it … because a lot of this is just hidden from us,” Gadepally said.
This is how much energy a single search query consumes on ChatGPT, reveals CEO Sam Altman
OpenAI CEO Sam Altman has revealed that an average ChatGPT query uses approximately 0.34 watt-hours of electricity and 0.000085 gallons of water. Altman compared the energy use to that of “an oven running for just over one second” or “a high-efficiency lightbulb operating for a couple of minutes” The disclosure is the most detailed public insight so far into the environmental impact of the company’s AI systems.
The figures and comparisons appeared in a blog post titled The Gentle Singularity.
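Altman’s comparisons can be sanity-checked with simple arithmetic. The appliance wattages below are assumptions (a roughly 1,000 W oven element and a 10 W LED bulb), since he did not specify them:

```python
# Sanity-check the appliance comparisons for a 0.34 Wh query.
# Appliance wattages are assumptions: ~1,000 W oven, 10 W LED bulb.
QUERY_WH = 0.34

def runtime_seconds(watts: float) -> float:
    """How long an appliance of the given wattage could run on the
    energy of a single query (Wh -> joules -> seconds)."""
    return QUERY_WH * 3600 / watts

print(round(runtime_seconds(1000), 2))     # oven: ~1.22 s ("just over one second")
print(round(runtime_seconds(10) / 60, 1))  # LED bulb: ~2.0 minutes
```

Under those assumed wattages, the math lines up with both of Altman’s comparisons.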
The figures are significantly lower than some earlier projections. A 2023 report by The Washington Post found that generating a 100-word email using GPT-4 could use “a little more than one bottle” of water, with usage varying depending on the data center location.
Altman also addressed the growing scrutiny around the sustainability of AI systems, particularly as models increase in size and usage. He wrote, “The economic value creation has started a flywheel of compounding infrastructure buildout to run these increasingly powerful AI systems.”
He further suggested that as data center operations become more automated, “the cost of intelligence should eventually converge to near the cost of electricity.”
The release of these statistics appears to be part of OpenAI’s broader efforts to promote transparency around its infrastructure and environmental footprint, especially as policymakers and researchers increasingly raise concerns about AI’s long-term energy demands. Earlier this year, some researchers predicted that AI could surpass Bitcoin mining in power consumption by the end of 2025.
Altman had previously shared that the politeness of users, such as saying “please” and “thank you” in queries, has cost the company tens of millions of dollars in electricity expenses over time.
As the environmental implications of AI continue to draw attention, disclosures like these are likely to influence both public perception and future regulation.
Want To Save The Planet? Stop Using AI — Here’s Why
New research from Germany reveals the shocking environmental cost of our AI conversations. The most advanced AI models can emit over 2,000 grams of CO2 equivalent to answer just 1,000 questions, and when models were allowed to “think out loud,” showing their work like a student solving a math problem, their energy consumption skyrocketed. The top performer, a 70-billion-parameter reasoning model called Cogito, achieved 84.9% accuracy but emitted 1,341 grams of CO2 equivalent, nearly 50 times more than the smallest model tested. The more powerful the model, the higher the environmental cost, but not all big models are equal: some, like Qwen 2.5, struck a balance, achieving strong performance (77.6% accuracy) with far lower emissions than similarly sized systems. This suggests that AI companies could design models that are both smart and environmentally conscious, though it may require sacrificing some of the advanced capabilities that make them smart.
In a nutshell: Advanced AI models produce significantly more carbon emissions, especially when they use reasoning to generate long, complex responses, sometimes emitting over 2,000 grams of CO₂ equivalent to answer just 1,000 questions.
The more powerful the model, the higher the environmental cost, but not all big models are equal: some, like Qwen 2.5, achieved strong performance with far lower emissions than similarly sized systems.
Despite AI’s growing energy demands, only a tiny fraction of research papers mention carbon emissions, highlighting a major blind spot in how we evaluate and design AI systems.
MUNICH — Every time you ask ChatGPT to write an email or have Claude solve a math problem, you’re contributing to a growing carbon footprint. One recent estimate suggests generative AI models now use as much electricity annually as entire countries. New research from Germany reveals the shocking environmental cost of our AI conversations. The numbers might make you think twice about your next chat with a bot.
The study, published in Frontiers in Communication, focused on how much energy large language models actually consume when we use them. According to researchers, the most advanced AI models can emit over 2,000 grams of CO2 equivalent to answer just 1,000 questions.
While we’ve all heard vague warnings about AI’s environmental impact, this study actually measured energy consumption in real time as different AI models worked through problems.
Researchers tested 14 different AI models, ranging from relatively small 7-billion-parameter models to massive 72-billion-parameter systems. The more parameters, the “smarter” the AI, but also the more energy-hungry it is.
ChatGPT and other AI systems are becoming as common as search engines for answering everyday questions. (Tada Images/Shutterstock)
Each model tackled 1,000 questions total: 500 multiple-choice questions where they just had to pick A, B, C, or D, and 500 free-response questions where they could write lengthy answers. The questions came from diverse subjects including philosophy, world history, international law, abstract algebra, and high school mathematics.
How AI “Thinks” Matters
Using an NVIDIA A100 GPU, the kind of powerful computer chip that powers most AI services, the team measured exactly how much electricity each model consumed and converted that into CO2 emissions using global energy grid averages.
Larger models consistently performed better on the tests, but they also consumed dramatically more energy. The top performer, a 70-billion-parameter reasoning model called Cogito, achieved 84.9% accuracy but emitted 1,341 grams of CO2 equivalent, nearly 50 times more than the smallest model tested.
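The conversion the researchers describe, from measured electricity to CO2 equivalent, is a single multiplication by a grid carbon-intensity factor. The 480 gCO₂e/kWh value below is an assumed global-average figure, not necessarily the exact factor used in the study:

```python
# Convert measured GPU energy into CO2-equivalent emissions using a
# grid carbon-intensity factor. 480 gCO2e/kWh is an assumed global
# average, not necessarily the study's exact conversion factor.
GRID_INTENSITY_G_PER_KWH = 480.0

def co2_grams(energy_kwh: float) -> float:
    """Grams of CO2 equivalent for a given energy draw in kWh."""
    return energy_kwh * GRID_INTENSITY_G_PER_KWH

# Working backward, the reported 1,341 g for the Cogito run implies:
implied_kwh = 1341 / GRID_INTENSITY_G_PER_KWH
print(round(implied_kwh, 2))  # ~2.79 kWh for the 1,000-question benchmark
```

The same energy draw would produce very different emissions on a coal-heavy grid versus a hydro-heavy one, which is why the grid-average assumption matters.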
There was also a difference between regular AI responses and “reasoning” responses. When AI models were allowed to “think out loud,” showing their work like a student solving a math problem, their energy consumption skyrocketed.
The study found that reasoning-enabled systems generated substantially more emissions than their standard counterparts. In some cases, reasoning modes consumed 4 to 6 times more energy than standard text generation.
When AI Gets Chatty, the Planet Pays
Part of the problem is that advanced AI models can’t seem to keep their answers short. When asked simple multiple-choice questions that should require just a one-letter answer, some reasoning models generated responses with over 14,000 words. One model produced a single answer that was 37,575 words long, longer than many novellas.
This verbosity comes with a cost. The study tracked “tokens,” which are units of text that AI models process and generate. While basic models might use 37 tokens (roughly 30 words) to answer a question, reasoning models averaged over 1,400 tokens per response, with some stretching into the thousands.
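If energy scales roughly linearly with the number of tokens generated, an assumption that simplifies the real picture, the token counts above translate directly into a cost multiplier:

```python
# Rough per-answer cost scaling with token count. Linear scaling of
# energy with generated tokens is a simplifying assumption.
CONCISE_TOKENS = 37      # typical terse answer, per the study
REASONING_TOKENS = 1400  # average reasoning-model answer, per the study

ratio = REASONING_TOKENS / CONCISE_TOKENS
print(round(ratio, 1))  # ~37.8x more generated text per answer
```

That multiplier is why asking for concise answers, and keeping prompts short, is one of the few energy levers an individual user actually controls.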
Different subjects also demanded varying amounts of computational power. Abstract algebra consistently stumped the models and required the most energy, while questions about world history were relatively easier for AI to handle efficiently.
Finding the Sweet Spot
Not all the findings were concerning. The research revealed that some models strike a balance between performance and environmental impact. The Qwen 2.5 model with 72 billion parameters achieved strong 77.6% accuracy while emitting just 427 grams of CO2 equivalent, less than one-third the emissions of comparable reasoning models.
This suggests that AI companies could potentially design models that are both smart and environmentally conscious, though it may require sacrificing some of the advanced reasoning capabilities that make headlines.
As AI grows more advanced, it may contribute to more carbon emissions and harm to the planet. (© Tierney – stock.adobe.com)
With AI chatbots becoming as common as search engines, these energy costs add up quickly. The study notes that generative AI models already consume about 29.3 terawatt-hours annually, equivalent to Ireland’s entire national electricity consumption.
Yet despite growing awareness of climate change, the researchers found that only about 2% of AI research papers even mention carbon emissions or environmental impact. Most studies rely on theoretical estimates rather than real-world measurements like this one.
The research points to several potential solutions for reducing AI’s environmental footprint. Companies could focus on optimizing reasoning efficiency rather than simply maximizing model size and capabilities. The study suggests that developing more efficient reasoning strategies could maintain high accuracy while reducing emissions.
The wide variation in performance across different subject areas also indicates that specialized models designed for specific tasks might be more environmentally friendly than general-purpose reasoning systems.
“If users know the exact CO₂ cost of their AI-generated outputs, such as casually turning themselves into an action figure, they might be more selective and thoughtful about when and how they use these technologies,” says study author Maximilian Dauner from the Munich University of Applied Sciences, in a statement.
How much environmental cost are we willing to accept for smarter artificial intelligence? This research suggests that our current trajectory toward more powerful, reasoning-capable AI comes with steep environmental trade-offs that most users never see.
Hey Chat, how much do you cost the environment when you answer my questions?
The United Arab Emirates became the world’s first country to offer free access to ChatGPT Plus, the premium version of ChatGPT, which is faster and more consistent than the normal version and can hold voice conversations, upload and analyze files, and generate images. About 34% of Americans rely on AI to help them accomplish some of their day-to-day activities, per polling from tech monitor Elf Sight. But there are concerns about the vast amounts of natural resources sucked up by AI, depleting reservoirs and requiring additional energy. The Trump administration has poured hundreds of millions of dollars of federal funding into “Stargate,” an AI infrastructure project for the U.S. and the UAE co-developed by tech giants OpenAI, Oracle and SoftBank, with thousands of acres near Abilene, Texas, earmarked for development. The OpenAI for Countries program may be fit for the nations that want it, researchers say, but it may not be fit for the environment.
KEY POINTS The United Arab Emirates recently gifted ChatGPT Plus to all of its citizens, free of charge.
ChatGPT’s parent company, OpenAI, is building artificial intelligence infrastructure throughout the United States and the UAE. It also says it is fielding requests from other countries to do the same for them.
Energy advocates are sounding the alarm: AI already taxes natural resources at a precipitous rate, which could mean serious consequences for the environment.
Earlier this week, the United Arab Emirates became the world’s first country to offer free access to ChatGPT Plus — the premium version of ChatGPT — to all its citizens. The premium version is faster and more consistent than the normal version; it also can hold voice conversations, upload and analyze your files, and generate its own images for your use.
This is just the beginning for OpenAI, ChatGPT’s parent company. OpenAI has announced intentions to partner with as many nations as possible through its “OpenAI for Countries program.”
OpenAI CEO Sam Altman has already described the UAE project as a “bold vision,” per Axios; wrapping artificial intelligence around the world would constitute an even bolder, more radical vision for a global population increasingly dependent on AI.
But can the Earth take it?
But there are concerns about the vast amounts of natural resources sucked up by AI, depleting reservoirs and requiring additional energy.
Meanwhile, politicians, business leaders and climate advocates continue to grapple over the consequences.
Traffic on Interstate 35 passes a Microsoft data center, Tuesday, Sept. 5, 2023, in West Des Moines, Iowa. Microsoft has been amassing a cluster of data centers to power its cloud computing services for more than a decade. | Charlie Neibergall, Associated Press
Texas leads the way with AI development
About 34% of Americans rely on AI to help them accomplish some of their day-to-day activities, per polling from tech monitor Elf Sight. That’s evidence of the early adoption of AI — especially because ChatGPT, which marked the beginning of the widespread AI craze, only launched in 2022.
OpenAI CEO Sam Altman became a billionaire in the following years. He was also a large donor to U.S. President Donald Trump’s 2024 presidential campaign and attended his inauguration.
The day after the inauguration, he made a public statement thanking the president for investing $500 billion into “Stargate,” which will develop AI infrastructure for the U.S.
“For (AI) to get built here, to create hundreds of thousands of jobs, to create a new industry centered here, we wouldn’t be able to do this without you, Mr. President, and I’m thrilled that we get to,” Altman said, per ABC News.
Since Inauguration Day, the Trump administration has poured hundreds of millions of dollars of federal funding into Stargate, which is being co-developed by tech giants OpenAI, Oracle and SoftBank. Thousands of acres near Abilene, Texas, have been earmarked for development, according to The Dallas Express.
There is no word yet on how Stargate might affect the state’s energy grid — which failed during natural disasters last year, leaving thousands of Texans in temporary darkness — or how it might affect the environment of a state where 41% of the land is already in drought.
Nevertheless, many Texans and national leaders eagerly anticipate economic expansion. And they and the UAE (which is getting its own Stargate through its deal with OpenAI) aren’t alone in the rush to AI.
OpenAI says that, after its “unprecedented investment” in American infrastructure, they have “heard from many countries” petitioning them to integrate AI into their countries, too — meaning personalized digital servants tailored for regional dialects, government structures and social needs and customs.
The OpenAI for Countries program is fit for them. But researchers say it may not be fit for the environment.
The OpenAI logo is displayed on a cellphone in front of an image generated by ChatGPT’s Dall-E text-to-image model, Dec. 8, 2023, in Boston. | Michael Dwyer, Associated Press
What happens when you hit ‘send’ on ChatGPT
“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud. Data centers (for AI) are present in our physical world … they have direct and indirect implications for biodiversity,” said Noman Bashir, a climate researcher at MIT.
Generative AI drinks a bottle of water for every 100-word email it writes. The electricity required by the massive machines powering programs like ChatGPT, Siri and Alexa is approaching levels equal to that of large countries like Russia, per research from MIT. ChatGPT alone uses enough electricity each day to power the Empire State Building for a year and a half. Tremendous amounts of fossil fuels, including diesel and crude oil, go into training generative AI.
And energy needs are only multiplying. The Harvard Business Review reports that data centers, or the physical facilities that hold information and communications systems (like the 900-acre facility planned for Stargate in Texas), are responsible for 2%-3% of global greenhouse gas emissions. The volume of data across the world doubles in size every two years.
“There is still much we don’t know about the environmental impact of AI but some of the data we do have is concerning,” said Golestan Radwan, who heads a United Nations environment agency. “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.”
Radwan’s agency recommends that countries begin tracking AI’s environmental impact; at the moment, most countries have few, if any, standards for AI’s environmental output. It also encourages countries to establish sustainability regulations around AI.
Finally, it urges tech companies to streamline their programs and begin recycling components and water.
Canny AI researchers are already at work to develop “green” AI — also known as sustainable or “net zero” AI — that could minimize the carbon footprints left by generative AI as it sprints across the globe.
But researchers also warn that green AI comes at the price of capability: the smarter the AI, the more energy it uses.
Earlier in May, a Republican-led tax bill proposed barring states from regulating AI for the next 10 years.
Last year, state legislatures across the country passed over 100 regulations surrounding AI; the tax bill would prevent state lawmakers from enforcing these regulations.
“We believe that excessive regulation of the AI sector could kill a transformative industry just as it’s taking off,” Vice President JD Vance told AI developers and regulators at a summit in Paris. “And I’d like to see that deregulatory flavor making its way into a lot of the conversations at this conference.”
Making AI greener: What can you do?
Researchers at the Harvard Business Review recommend ways an individual can reduce their AI-created environmental impact.
Use existing AI — don’t make your own program. Creating and training AI programs requires vast amounts of energy. There are already a myriad of AI programs available, many for free, and many specific to certain businesses or regions to cater to their personal needs.
Use AI only when you really need it. Machine learning models are excellent at helping scientists predict natural disasters and understand diseases. They are less valuable for providing answers, especially when answers are often hallucinated. Writing emails and asking questions of ChatGPT “may be depleting the Earth’s health more than … helping its people,” say Harvard researchers.