AI chatbots like OpenAI’s ChatGPT and Google’s Gemini have become increasingly prevalent in daily life, but every time someone chats with these AI tools, it is not just their curiosity being satisfied: each exchange is also a stealthy drain on Earth’s water and energy. Every question you ask, no matter its complexity, sets off a rapid, energy-hungry process that consumes a startling amount of water. The ecological (as well as economic) costs of these digital conversations are raising alarms, especially as reliance on AI grows by the day.
Behind the scenes: Data centers at work
At the core of every AI chatbot is a vast network of data centers: physical buildings equipped with thousands of extremely high-powered computers. These data centers work around the clock to train and run AI models, converting massive amounts of energy into useful responses for users. The computers generate intense heat that must be managed to avoid damage or failure, and that management depends on robust cooling systems that rely heavily on water. Training the GPT-3 language model in Microsoft’s US data centers, for example, can directly evaporate approximately 700,000 liters of clean freshwater.
Cooling under pressure
The cooling mechanisms in these digital powerhouses often use an extremely large amount of water. Shaolei Ren, a researcher at the University of California, Riverside, stated that “OpenAI’s ChatGPT consumes 500 ml of water for every 5 to 50 prompts it answers” — roughly 10 to 100 milliliters per prompt. While that figure might seem insignificant on its own, recent studies project that the data centers powering AI will withdraw more than six times the annual water withdrawal of the entire country of Denmark in the coming years.
The energy aspect
Powering AI systems is not solely about water; it is also a colossal energy challenge. Shaolei Ren found, for example, that using ChatGPT to write just two 200-word emails consumes as much energy as a Tesla Model 3 uses to drive one mile. That may seem surprising, but when the lens is widened and the millions of interactions are taken into account, the impact on electrical grids and carbon emissions is staggering.
Altering our digital consumption
Understanding the full environmental footprint of AI chatbots challenges us to rethink our reliance on these tools. As the convenience of AI grows, so does the toll it takes on our water and energy resources. Recognizing these costs is the first step toward advocating for more sustainable alternatives to today’s AI models.