Electricity use of AI coding agents | Simon P. Couch
January 22, 2026
Throughout 2025, we got better estimates of the electricity and water use of AI chatbots. There are all sorts of posts I could cite on this topic, but a favorite is this blog post from Our World in Data’s Hannah Ritchie. On the water front:
The average American uses 1600 liters of water per day, so even if you make 100 prompts per day, at 2ml per prompt, that’s only 0.01% of your total water consumption. Using a shower for one second would use far more.
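The arithmetic behind that estimate is easy to verify. A minimal sketch, using only the figures from the quote (1600 liters per day, 100 prompts per day, 2 ml per prompt):

```python
# Figures from the quoted estimate; none of these are my own measurements.
daily_water_l = 1600        # average American daily water use, in liters
prompts_per_day = 100
water_per_prompt_ml = 2

# Total prompt-related water per day, converted from ml to liters.
prompt_water_l = prompts_per_day * water_per_prompt_ml / 1000  # 0.2 L

share = prompt_water_l / daily_water_l
print(f"{share:.4%}")  # prints 0.0125%
```

The exact figure is 0.0125%, which the quote rounds down to "only 0.01%"; either way, it is a vanishingly small share of daily water use.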
Generally, these analyses guide my own thinking about the environmental impacts of my individual usage of LLMs; if I’m interested in reducing my personal carbon footprint, I’m much better off driving a couple fewer miles each week or avoiding one flight each year. This is indeed the right conclusion for users of chat interfaces like chatgpt.com or claude.ai.
For a while now, there have been very serious concerns about the energy and water use impacts of large language models. Given that we face a genuine climate crisis, it’s certainly important to consider this issue. But we’ve seen very few studies of it, and certainly very few disinterested ones.
This suggests that the concerns are somewhat overstated, though it must be noted that these estimates concern inference; training, at least for now, seems to consume the majority of energy associated with large language models.