Google defends AI energy use amid rising emissions concerns

Google has released new data claiming its AI chatbot Gemini uses far less energy and water per prompt than many reports suggest.

The company says a single Gemini text query uses about 0.24 watt-hours of energy, roughly equivalent to watching nine seconds of television. It also emits about 0.03 grams of carbon dioxide and consumes roughly five drops of water for cooling.
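
As a rough cross-check (back-of-the-envelope arithmetic, not part of Google's methodology; the television wattage and millilitres-per-drop values are assumptions), those equivalences are self-consistent:

```python
# Sanity-check Google's per-prompt equivalences.
# The television wattage and millilitres-per-drop values below are
# assumptions for illustration, not figures from Google's study.

ENERGY_PER_PROMPT_WH = 0.24   # reported energy per Gemini text prompt
TV_SECONDS = 9                # "nine seconds of television"
DROPS_OF_WATER = 5            # "roughly five drops"
ML_PER_DROP = 0.05            # common approximation for one water drop

# 0.24 Wh spread over 9 seconds implies a television drawing ~96 W.
implied_tv_watts = ENERGY_PER_PROMPT_WH / (TV_SECONDS / 3600)
water_ml = DROPS_OF_WATER * ML_PER_DROP

print(f"Implied TV power: {implied_tv_watts:.0f} W")   # ~96 W, a typical flat-screen set
print(f"Water per prompt: {water_ml:.2f} mL")          # ~0.25 mL
```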

Google arrived at these figures by accounting for data centre equipment, idle chip capacity, and cooling systems, arguing that other studies focus too narrowly on the power drawn by the machines running the model. The company claims its reported figures better reflect “true operating efficiency at scale.”
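
A minimal sketch of why the measurement boundary matters, using invented placeholder values rather than Google's published methodology: counting only the chips actively serving a prompt gives a smaller number than also counting idle capacity, host machines, and cooling overhead.

```python
# Schematic only: every component value below is a hypothetical placeholder,
# chosen to illustrate how the measurement boundary changes the estimate.

active_chip_wh = 0.10        # energy drawn by accelerators serving the prompt (assumed)
idle_capacity_share = 0.25   # provisioned-but-idle chips (assumed)
host_overhead_share = 0.30   # CPUs, memory, networking on host machines (assumed)
pue = 1.10                   # power usage effectiveness: cooling and facility overhead (assumed)

chip_only_wh = active_chip_wh
full_stack_wh = active_chip_wh * (1 + idle_capacity_share + host_overhead_share) * pue

print(f"Chip-only estimate:  {chip_only_wh:.2f} Wh per prompt")
print(f"Full-stack estimate: {full_stack_wh:.2f} Wh per prompt")
```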

The release comes as global energy experts warn of AI’s rising footprint. The International Energy Agency projects that AI-related electricity demand could double within five years, roughly matching Japan’s current annual electricity consumption.

By these comparisons, Gemini appears more efficient than its competitors. A study by the Electric Power Research Institute estimated that an OpenAI ChatGPT prompt uses 2.9 watt-hours, while a Google search uses 0.3 watt-hours.
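
Taking those figures at face value (the estimates come from different studies with different measurement boundaries, so the ratios are only illustrative):

```python
# Illustrative ratios only; the underlying estimates use different
# methodologies and measurement boundaries.

estimates_wh = {
    "Gemini text prompt (Google)": 0.24,
    "ChatGPT prompt (EPRI estimate)": 2.9,
    "Google search (EPRI estimate)": 0.3,
}

baseline = estimates_wh["Gemini text prompt (Google)"]
for name, wh in estimates_wh.items():
    print(f"{name}: {wh} Wh ({wh / baseline:.1f}x Gemini)")
```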

Despite these efficiency claims, Google’s overall emissions have risen sharply. Its latest environmental report shows a 51 per cent increase since 2019, driven mainly by hardware production linked to AI development.

The company highlights major efficiency gains over time, saying per-prompt energy use has fallen by a factor of 33 and the per-prompt carbon footprint by a factor of 44 since August 2024.

However, the study does not disclose how many Gemini queries are processed each day, leaving the service’s total environmental impact unclear. Environmental groups say this lack of transparency limits public understanding of AI’s growing energy demands.

Google insists it will continue to focus on reducing AI’s environmental cost.