Dev.to•Jan 18, 2026, 11:37 PM
OpenAI's ChatGPT Uses Two Cups of Water Per Query, Devs Suddenly Rediscover Claude Haiku Exists

A recent investigation into the environmental impact of artificial intelligence has revealed significant energy and water consumption associated with AI queries. A single ChatGPT query uses approximately 0.3 watt-hours of electricity, 10 times more than a Google search, and over a billion AI queries occur every day. Data center cooling demands substantial water as well: a large facility can consume 3-5 million gallons daily, the equivalent of 5-8 Olympic swimming pools.

The economics are just as lopsided. US data centers consume over 4% of the country's electricity yet support only about 23,000 permanent jobs. OpenAI's Stargate project in Texas, for example, created 1,500 construction jobs but only 100 permanent positions, subsidized by taxpayers at $1.95 million per job.

Researchers have found, however, that strategic model selection, prompt engineering, and caching can cut AI's environmental footprint by 50-90%, underscoring how much informed development choices matter in practice.
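Two of those levers, model selection and caching, are easy to sketch in code. The snippet below is a minimal, hypothetical illustration only: `call_model()`, the model names, and the 500-character routing threshold are placeholder assumptions, not any provider's actual API or recommended settings. The idea is simply that short, simple queries go to a smaller model, and repeated prompts are served from a cache without running inference at all.

```python
import hashlib

# Illustrative model names, not real API identifiers.
SMALL_MODEL = "claude-haiku"     # smaller, cheaper, lower-energy model
LARGE_MODEL = "frontier-model"   # heavyweight model reserved for hard queries

_cache: dict[str, str] = {}      # in-memory response cache, keyed by model + prompt


def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in: wire this up to your provider's SDK."""
    raise NotImplementedError


def _cache_key(model: str, prompt: str) -> str:
    # Normalize whitespace and case so trivially different prompts share one entry.
    normalized = " ".join(prompt.lower().split())
    return hashlib.sha256(f"{model}:{normalized}".encode()).hexdigest()


def answer(prompt: str) -> str:
    # Model selection: short, single-line prompts go to the small model.
    model = SMALL_MODEL if len(prompt) < 500 and "\n" not in prompt else LARGE_MODEL
    key = _cache_key(model, prompt)
    if key in _cache:            # cache hit: no inference run, no extra energy spent
        return _cache[key]
    result = call_model(model, prompt)
    _cache[key] = result
    return result
```

The third lever, prompt engineering, doesn't show up in the code structure at all: tighter prompts and output limits just mean fewer tokens generated per query.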

Viral Score: 87%
