Archived: AI Is Thirsty - Clive Thompson - Medium

This is a simplified archive of the page at https://clivethompson.medium.com/ai-is-thirsty-37f99f24a26e


Each chat with a large language model is like dumping a bottle of water on the ground

Clive Thompson

Photo by Jacek Dylag on Unsplash

Today I used ChatGPT to get some help making a browser plugin. I posted my queries, then watched as the code and text spilled down the screen. This is the part of large language models that I dig! As a hobbyist developer, getting suggestions of customized lines of software can be a powerful way to learn.

But as it turns out, using ChatGPT consumes a lot of an unexpected resource:

Water.

The code wasn’t quite what I was looking for, so I chatted with ChatGPT for 15 minutes or so, slowly coaxing it to revise. By the time I was done, we’d gone back and forth about 20 times.

And during that exchange? Microsoft’s servers probably used about as much water as if I’d just bought a half-liter bottle … and spilled it on the ground.
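The back-of-envelope math behind that comparison is simple. As a sketch (using the article's own rough figures of a half-liter bottle spread over about 20 exchanges, not any official measurement):

```python
# Rough per-exchange water estimate, using the article's figures:
# ~500 mL of cooling water for a ~20-exchange conversation.
BOTTLE_ML = 500   # roughly a half-liter bottle
EXCHANGES = 20    # back-and-forth turns in the conversation

ml_per_exchange = BOTTLE_ML / EXCHANGES
print(f"~{ml_per_exchange:.0f} mL of water per exchange")  # ~25 mL
```

So each individual prompt-and-response in that session cost on the order of a few tablespoons of water, and it adds up quickly across millions of users.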

AI, it turns out, is incredibly thirsty tech — ploughing through torrents of fresh water every day. Given that large-language-model AI is likely to be woven into ever more apps and appliances, it's worth pondering just how much water our booming use of AI will consume.

Why precisely does large-language-model AI require water? Back in April, a group of researchers pondered this question as they created an estimate of AI’s water consumption. As they note in their paper (which is freely available in full), the main use of water comes when tech firms train their AI, and when the firms run inferences (i.e. when you, I or anyone else interacts with the model).

Tech firms like Microsoft and Google and Meta do all that training (and inferring) on their huge computational farms. That computation requires a ton of energy, which generates heat. To remove that heat from server farms, the tech firms generally use cooling towers, where water is evaporated to send the heat out into the outside world. That evaporation? That’s how AI consumes water. It is, it’s worth noting, almost entirely freshwater.
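The physics of that evaporation can be sketched with a simplified model (my assumption, not the researchers' actual methodology, which accounts for cooling efficiency and other factors): if a cooling tower rejected heat purely by evaporating water, the water consumed would be the heat energy divided by water's latent heat of vaporization.

```python
# Idealized model: all server heat is rejected by evaporating water.
# Real cooling towers are less than perfectly evaporative, so treat
# this as an upper-bound sketch, not a measured figure.
LATENT_HEAT_J_PER_KG = 2.45e6  # ~2.45 MJ/kg for water near ambient temperature

def water_evaporated_liters(energy_kwh: float) -> float:
    """Liters of water evaporated to reject `energy_kwh` of heat."""
    joules = energy_kwh * 3.6e6          # 1 kWh = 3.6 MJ
    kg = joules / LATENT_HEAT_J_PER_KG   # mass of water evaporated
    return kg                            # 1 kg of water ≈ 1 liter

print(f"{water_evaporated_liters(1):.2f} L per kWh")  # ≈ 1.47 L/kWh
```

That roughly 1.5 liters per kilowatt-hour is why energy-hungry computation translates so directly into water consumption.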

Tech firms do not publish specific stats on how much freshwater they use for different forms of computation. So the academics did some estimates. They calculated how much energy it would take to train one of the well-known large language-models (and…