ZDNET's key takeaways
- Google released energy and water consumption estimates for its Gemini AI apps.
- It is the first major tech company to publish this information.
- Estimates are lower than public calculations, but industry-wide usage is still unclear.
AI demand is rapidly accelerating, which means the infrastructure that makes it possible -- data centers and the power plants that supply them -- is expanding, too. The lack of concrete data around exactly how much energy AI uses has created concern and debate about how that demand is impacting the environment. Google hopes new data will change that.
Also: How much energy does AI really use? The answer is surprising - and a little complicated
In an industry first, the company published estimates of its Gemini chatbot's energy usage and emissions. The average Gemini text prompt uses "0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water," Google said Thursday, comparing the per-prompt impact to "watching TV for less than nine seconds."
Of course, that's just one average prompt. Google estimated Gemini had 350 million monthly users in March (almost half of ChatGPT user estimates); depending on how many people are querying Gemini at any given moment, what enterprise clients are using the chatbot for, and how many power users are sending more complex prompts, those seconds can add up.
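To put those figures in context, here is a quick back-of-envelope check in Python. The TV wattage and per-user prompt count below are illustrative assumptions on our part, not numbers from Google's report:

```python
# Back-of-envelope check of Google's per-prompt figure against the
# "nine seconds of TV" comparison, assuming a ~100 W television
# (a hypothetical wattage; Google does not specify one).
PROMPT_WH = 0.24                 # Google's reported Wh per median text prompt

tv_watts = 100                   # assumed TV power draw
tv_seconds = PROMPT_WH / tv_watts * 3600
print(f"One prompt ~= {tv_seconds:.1f} s of TV")  # ~8.6 s, under nine seconds

# Hypothetical aggregate: 350 million monthly users each sending five
# prompts a day (an assumed rate, not a reported one).
users, prompts_per_day = 350e6, 5
daily_mwh = users * prompts_per_day * PROMPT_WH / 1e6
print(f"~{daily_mwh:,.0f} MWh per day under these assumptions")  # ~420 MWh
```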
Google published a framework for tracking the emissions, energy, and water use of its Gemini apps, saying its findings are "substantially lower than many public estimates" of the resources AI consumes. For more details on the findings, keep reading.
A first-of-its-kind report
Google started publishing information on its global data center electricity usage in 2020, and provides annual reports on the Power Usage Effectiveness (PUE) of its data centers going back to 2008. Though Google did not publish its raw AI energy data, it is the first major tech company to release granular per-prompt reporting on the subject.
Also: How web scraping actually works - and why AI changes everything
In June, after alarming claims about ChatGPT's resource and water use circulated on social media, OpenAI CEO Sam Altman wrote in a blog post that the average ChatGPT query uses "about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes." He added that a query uses "roughly one fifteenth of a teaspoon" of water, but did not provide methodology or data to support either statement.
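Altman's oven analogy checks out arithmetically if you assume a roughly 1,200 W oven -- a wattage we are supplying ourselves, since his post named no figure:

```python
# Sanity check on Altman's claim, assuming a hypothetical 1,200 W oven.
QUERY_WH = 0.34                  # Altman's stated Wh per ChatGPT query
oven_watts = 1200                # assumed oven power draw
print(f"{QUERY_WH / oven_watts * 3600:.2f} s of oven use")  # ~1.02 s
```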
While reporting indicates Meta's data centers are using huge amounts of water, none of the industry's other major players, including Anthropic, has shared specifics.
Usage appears lower than expected
According to Google, some AI resource calculations "only include active machine consumption" or focus solely on the inference cost of models, ignoring crucial factors that can make an AI system function more efficiently, and therefore with a smaller footprint. For example, larger reasoning models need more compute than smaller ones; to improve efficiency, approaches like speculative decoding (which Google uses) let fewer chips address more queries by having a smaller model make predictions that a larger model then verifies, as opposed to the larger model handling the entire process.
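To make that speculative-decoding idea concrete, here is a minimal greedy-variant sketch with toy stand-in models. The function names and token rules are invented for illustration, and real systems verify the entire draft in a single parallel pass of the large model rather than token by token:

```python
# A minimal sketch of speculative decoding (greedy variant) with toy
# deterministic "models" standing in for real LLMs; illustrative only.

def draft_model(tokens):
    # Small, cheap model: fast guess at the next token.
    return (tokens[-1] + 1) % 50

def target_model(tokens):
    # Large, accurate model: treated as ground truth here.
    return (tokens[-1] + 1) % 50 if tokens[-1] % 7 else (tokens[-1] + 2) % 50

def speculative_step(tokens, k=4):
    # 1. The small model drafts k tokens autoregressively (cheap).
    ctx = list(tokens)
    draft = []
    for _ in range(k):
        t = draft_model(ctx)
        draft.append(t)
        ctx.append(t)
    # 2. The large model checks the draft (in real systems, all k positions
    #    are scored in one parallel pass). Keep the longest agreeing prefix
    #    and substitute the large model's token at the first mismatch.
    ctx = list(tokens)
    accepted = []
    for t in draft:
        verified = target_model(ctx)
        if verified != t:
            accepted.append(verified)  # correction ends the step
            break
        accepted.append(t)
        ctx.append(t)
    return tokens + accepted

seq = [0]
for _ in range(5):
    seq = speculative_step(seq)
print(seq)  # several tokens generated per large-model step instead of one
```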
In response, Google developed its own methodology for the report, taking into account several components that it said are often overlooked. In its testing, Google said it tracked not just the energy and water used by the model while actively computing, but also how chips are actually utilized at scale, which it said "can be much lower than theoretical maximums."
Also: 30% of Americans are now active AI users, says new ComScore data
The company monitored energy used beyond the TPUs and GPUs that AI runs on, factoring in host CPU and RAM as well, to ensure all components that contribute to an AI query were accounted for. It also included the energy used by "idle machines," or systems that must be kept on standby to handle usage spikes even when not actively computing, as well as data center overhead that runs regardless of AI workloads, such as cooling systems and the water they consume.
Google said it compared a "non-comprehensive" approach to its own: the former estimated that "the median Gemini text prompt uses 0.10 Wh of energy, emits 0.02 gCO2e, and consumes 0.12 mL of water" -- numbers Google said "substantially" underestimated Gemini's footprint and were "optimistic" at best.
Its own methodology, on the other hand, produced higher estimates: 0.24 Wh, 0.03 gCO2e, and 0.26 mL of water per median text prompt. "We believe this is the most complete view of AI's overall footprint," Google said.
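As a rough illustration of how those overlooked components can more than double a narrow estimate, here is a toy reconciliation of the two figures above. The multipliers are assumptions chosen so the arithmetic lands on Google's numbers; they are not Google's published breakdown:

```python
# Toy reconciliation of the narrow (0.10 Wh) and comprehensive (0.24 Wh)
# per-prompt figures. All multipliers below are hypothetical placeholders.
ACTIVE_CHIP_WH = 0.10    # narrow estimate: active TPU/GPU compute only

host_overhead = 1.30     # assumed share for host CPU and RAM
idle_capacity = 1.68     # assumed share for provisioned-but-idle machines
pue = 1.10               # facility overhead (cooling, power delivery);
                         # roughly the fleet-wide PUE Google has reported

total = ACTIVE_CHIP_WH * host_overhead * idle_capacity * pue
print(f"{total:.2f} Wh per prompt")  # ~0.24 Wh, the comprehensive figure
```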
Also: Cisco reveals its AI-ready data center strategy - boosted by its Nvidia partnership
Despite revealing higher numbers, Google still said AI energy usage has been overhyped.
"The energy consumption, carbon emissions, and water consumption were actually a lot lower than what we've been seeing in some of the public estimates," said Savannah Goodman, head of Google's advanced energy labs, in a video shared with ZDNET. Goodman did not cite specific estimates for comparison.
The company said that "over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively, all while delivering higher quality responses." However, Google added that neither the data nor the claims had been vetted by a third party.
Google's 'full-stack' sustainability future
Google cited several approaches it's implementing in data centers to improve efficiency overall, which it says will decrease its AI emissions footprint. These include maximizing hardware performance, using hybrid reasoning, and distillation, or having larger models teach smaller ones. Google also reiterated commitments to using clean energy sources and replenishing the freshwater it uses for cooling.
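Of those techniques, distillation is the easiest to picture in code. Below is a minimal sketch of the core idea -- training a small student to match a large teacher's output distribution -- using toy logits and plain Python; real pipelines are far more involved:

```python
# A minimal sketch of knowledge distillation: a small "student" is trained
# to match a large "teacher's" softened output distribution. Toy numbers.
import math

def softmax(logits, temperature=2.0):
    # Higher temperature softens the distribution, a common distillation trick.
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    # D_KL(teacher || student): the quantity distillation training minimizes.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher_logits = [4.0, 1.0, 0.5]   # large model's scores for 3 classes
student_logits = [2.0, 1.5, 1.0]   # small model's current scores

loss = kl_divergence(softmax(teacher_logits), softmax(student_logits))
print(f"distillation loss: {loss:.3f}")  # gradient descent would shrink this
```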
Also: Stop using AI for these 9 work tasks - here's why
While the company's data center emissions might be down 12%, its latest sustainability report, released in June, showed Google's energy usage has more than doubled in just four years. Its data pertaining to Gemini appears less alarming than many other AI usage estimates out there, but that shouldn't be treated as evidence that Google is below energy usage norms for the tech industry, or is making larger-scale cuts -- especially given how many people use Gemini every day.
Why it matters
As AI expands, energy efficiency has been top of mind for many -- but the industry is growing faster than environmental concerns can gain traction. Reporting indicates AI demand is driving up electricity and related resource use, which makes it an important component of our environmental future. A recent Reuters/Ipsos poll showed 61% of Americans are concerned about AI's electricity use.
Last month, President Trump pledged $92 billion toward AI infrastructure in Pennsylvania, an extension of the $500 billion Stargate initiative he announced shortly after taking office in January, alongside several companies, including OpenAI. The Trump administration's AI Action Plan, released last month, clarified intentions to "reject radical climate dogma," reduce regulations, and "expedite environmental permitting" for new data centers and power plants.
Also: How the Trump administration changed AI: A timeline
That said, if applied correctly, AI could also help curb emissions and create sustainable energy futures that could mitigate the impact of climate change.
The more data the public has on AI's impact, the better it can advocate for sustainable applications. More metric sharing -- especially when company data finally gets vetted by independent third parties -- could create industry standards and competitive incentives for users and businesses to take emissions and energy use into account when selecting a model. Ideally, Google's report will incentivize other companies to share similar information about their own AI systems.
While Google's numbers might give individual users some relief that their handful of queries isn't using an entire bottle of potable water, they can't be considered in a vacuum. As AI use goes up, these numbers will only continue to compound unless data center operators invest seriously in renewable energy sources -- a process experts say could be deprioritized given the rapid pace of the industry and the Trump administration's priorities.