A single hyperscale data center can consume up to 5 million gallons of drinking water per day [WHYY]. That’s roughly what a town of 50,000 people uses. A February 2026 Wharton panel sharpened the picture: half the industry is now expanding into water-constrained regions, even as facilities hit that 5-million-gallon ceiling.
The sustainability conversation around AI has focused almost entirely on energy: kilowatt-hours, carbon offsets, renewable procurement. Water barely registers. That’s a problem. You can generate more electricity, but you can’t manufacture more freshwater. The numbers surfacing in early 2026 make the case impossible to ignore.
The Hidden Water Crisis Behind Your Screen
Most developers think about compute costs in terms of GPU hours and electricity bills.
Water doesn’t show up in your AWS invoice, which is partly why nobody talks about it. The scale, though, is staggering.
In 2023, U.S. data centers consumed an estimated 66 billion liters of water just for operations [TNFD]. Google alone reported withdrawing 37 billion liters that same year, with 80% consumed through evaporation [IIGCC], meaning it never returns to the local water table. That water is simply gone.
The growth trajectory is more alarming. On-site water consumption for data center cooling is projected to grow from 22 billion gallons to 34 billion gallons by 2030 [Bluefield]. Some projections put annual on-site use at two to four times 2023 levels by 2028, rising to roughly 150 to 280 billion liters [Airsys].
Unlike power consumption, water usage carries no public reporting mandate in most jurisdictions. Only a handful of companies voluntarily disclose Water Usage Effectiveness (WUE) scores. The rest are effectively untracked. You can’t optimize what you don’t measure, and right now the industry isn’t measuring.
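For readers unfamiliar with the metric: WUE, as defined by The Green Grid, is simply annual site water consumption divided by IT equipment energy, in liters per kilowatt-hour. Here is a minimal sketch of the calculation with illustrative inputs (the 70% utilization figure is an assumption, not a disclosed number):

```python
def wue(annual_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness (The Green Grid metric): liters of
    water consumed per kilowatt-hour of IT equipment energy."""
    return annual_water_liters / it_energy_kwh

# Illustrative only: a 100 MW facility at an assumed 70% average IT load,
# using the 528,000 gal/day average cited below
it_energy_kwh = 100_000 * 8_760 * 0.70    # kW * hours/year * load factor
water_liters = 528_000 * 3.785 * 365      # gal/day -> liters/year
print(f"WUE ~ {wue(water_liters, it_energy_kwh):.2f} L/kWh")  # ~1.19
```

An operator that publishes a WUE makes exactly this number auditable; one that doesn't leaves both the numerator and the denominator invisible.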
How Much Water Data Centers Actually Use
Real numbers help here. An average hyperscale data center running at roughly 100 MW in the U.S. may consume around 528,000 gallons of water per day [Airsys]. That’s the average. The largest facilities running the heaviest AI workloads can hit 5 million gallons daily [WHYY][EcoFlow].
“Hyperscale data centers have the potential to be among the largest consumers of water in the basin, especially when considered as a whole. Millions of gallons a day is more water than is used by most communities.” [WHYY]
Cloud providers market “sustainable infrastructure.” Reality is closer to outdrinking entire counties. Training large AI models is particularly water-intensive because sustained, high-density compute loads push cooling systems to their limits for weeks or months. Inference workloads, meaning every ChatGPT query and every Copilot suggestion, compound the demand continuously.
Key comparisons that ground the scale:
- Average hyperscale facility: roughly 528,000 gallons per day [Airsys]
- Largest hyperscale facilities: up to 5 million gallons per day [WHYY][EcoFlow]
- Typical U.S. household: roughly 300 gallons per day
- Small town of 50,000: roughly 5 million gallons per day
A single campus can match or exceed the water draw of a small town. The average facility alone drinks as much as roughly 1,700 households; the largest match a town of 50,000 outright. That isn't hyperbole. It's arithmetic, as the sketch below shows.
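The claim is easy to check against the figures above. A back-of-the-envelope sketch (the household and town figures are the rough averages cited, not measurements):

```python
# Rough daily water draw in gallons, from the figures cited above
facility_avg = 528_000      # average hyperscale facility [Airsys]
facility_max = 5_000_000    # largest facilities [WHYY][EcoFlow]
household = 300             # typical U.S. household
town_50k = 5_000_000        # town of 50,000 people

print(f"Average facility ~ {facility_avg / household:,.0f} households")      # ~1,760
print(f"Largest facility ~ {facility_max / household:,.0f} households")      # ~16,667
print(f"Largest facility vs. town of 50,000: {facility_max / town_50k:.1f}x")  # 1.0x
```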
Why Cooling Systems Are So Water-Hungry
Here’s the engineering tradeoff at the core of this problem.
Modern GPUs, including the H100s and B200s powering AI workloads, can each dissipate 700 watts or more of heat. Pack thousands of them into a facility and you're generating heat equivalent to a small industrial furnace running 24/7.
Evaporative cooling towers are the dominant solution because they’re cheap and energy-efficient. Water gets sprayed across hot surfaces, absorbs heat, and evaporates. The physics work well. The catch: that water is consumed, not recycled. It turns into vapor and leaves the system. Google’s own numbers confirm this: 80% of their withdrawn water is lost to evaporation [IIGCC].
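The volumes follow directly from the physics. Evaporating one kilogram of water absorbs roughly 2.26 MJ (its latent heat of vaporization), so heat load translates almost linearly into water loss. A rough upper-bound estimate, assuming for simplicity that every watt of IT heat is rejected by evaporation:

```python
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
LITERS_PER_GALLON = 3.785
SECONDS_PER_DAY = 86_400

def evaporative_gallons_per_day(heat_watts: float) -> float:
    """Upper bound on water evaporated per day to reject a continuous
    heat load, assuming all heat leaves as vapor (1 L of water ~ 1 kg)."""
    kg_per_second = heat_watts / LATENT_HEAT_J_PER_KG
    liters_per_day = kg_per_second * SECONDS_PER_DAY
    return liters_per_day / LITERS_PER_GALLON

# A 100 MW facility, if every watt were rejected evaporatively:
print(f"{evaporative_gallons_per_day(100e6):,.0f} gal/day")  # ~1,010,000
```

Real facilities come in under this bound (the 528,000-gallon average above) because loads fluctuate and towers reject some heat without evaporation, but the order of magnitude is fixed by the latent heat, not by sloppy engineering.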
The alternative, air cooling, uses less water but demands significantly more electricity, which means higher carbon emissions and bigger power bills. Operators have historically chosen the water-intensive path because electricity costs are visible and water costs are not. It's a classic externality problem: the environmental cost gets pushed onto the local watershed while the operator saves on its energy line item.
Regions Feeling the Strain Most Acutely
The Wharton panel’s finding that 50% of industry expansion is happening in water-constrained regions isn’t surprising if you follow the siting patterns.
Arizona, Nevada, and Northern Virginia, already under water stress, host some of the densest data center clusters on the planet. Phoenix keeps adding major facilities despite sitting in the middle of a multi-decade drought.
The conflict is no longer theoretical. Local governments and agricultural communities are pushing back:
- The Netherlands imposed restrictions on new data center construction near Amsterdam, citing water and energy stress
- Several U.S. municipalities have explored or enacted moratoriums on new data center permits
- Farmers in drought-prone regions are increasingly vocal about competing for the same aquifer
These regions often attract data centers with tax incentives and cheap land, then discover the water bill gets paid by everyone downstream. When a facility consumes millions of gallons daily from a shared basin, the cost isn’t on the operator’s balance sheet. It’s on the community’s.
Solutions Already Reducing Water Consumption
Credible alternatives exist, and some are already deployed at scale.
Direct liquid cooling (DLC) circulates coolant directly to chip packages, bypassing evaporative towers entirely. Industry benchmarks suggest DLC can reduce water usage by up to 90% compared to traditional evaporative systems. Retrofitting existing facilities is expensive, and DLC requires different rack designs, plumbing, and maintenance expertise. For new builds, though, the economics are increasingly favorable, especially as GPU power density keeps climbing.
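Applied to the averages above, the savings are straightforward to size. A quick sketch (the 90% figure is the upper bound from the industry benchmarks just mentioned, not a measured guarantee):

```python
baseline_gal_per_day = 528_000   # evaporative baseline, average facility [Airsys]
dlc_reduction = 0.90             # upper-bound savings from industry benchmarks

dlc_gal_per_day = baseline_gal_per_day * (1 - dlc_reduction)
gallons_saved_per_year = (baseline_gal_per_day - dlc_gal_per_day) * 365

print(f"DLC facility: ~{dlc_gal_per_day:,.0f} gal/day")             # ~52,800
print(f"Annual savings: ~{gallons_saved_per_year / 1e6:.0f}M gal")  # ~173M
```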
Geographic siting is the other lever. Meta’s data center in Luleå, Sweden, uses outside air for cooling nearly year-round, achieving near-zero water consumption for cooling. Iceland, northern Canada, and Scandinavian countries offer similar advantages. The tradeoff is latency: shipping packets from Luleå to New York adds milliseconds, which matters for real-time applications but is irrelevant for training runs and batch processing.
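That latency penalty can be bounded from geometry alone. Light in optical fiber travels at roughly two-thirds of its vacuum speed, so great-circle distance sets a hard floor on round-trip time. A sketch (the ~6,200 km distance is the approximate great-circle figure for Luleå to New York; real fiber routes are longer):

```python
FIBER_SPEED_M_PER_S = 2.0e8   # ~2/3 of c, typical for light in optical fiber
lulea_to_nyc_m = 6.2e6        # approximate great-circle distance, ~6,200 km

one_way_ms = lulea_to_nyc_m / FIBER_SPEED_M_PER_S * 1_000
print(f"One-way floor: {one_way_ms:.0f} ms, round trip: {2 * one_way_ms:.0f} ms")
# ~31 ms one way, ~62 ms round trip; routing overhead adds more
```

Sixty-plus milliseconds of round trip disqualifies latency-sensitive inference for U.S. users, but it is noise against a training run measured in weeks.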
A practical path forward for the industry:
- Deploy DLC in new facilities as the default, not the exception
- Route training and batch workloads to cold-climate facilities where water-free cooling is viable (a placement sketch follows this list)
- Mandate public WUE reporting, because you can’t benchmark what isn’t disclosed
- Factor local water stress into siting decisions with the same rigor applied to power availability
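To make the workload-routing item concrete, here is a minimal sketch of what water-aware placement could look like. Everything in it, the Region fields, the WUE and latency numbers, the region names, is hypothetical and illustrative of the policy, not any provider's actual API or data:

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    wue_l_per_kwh: float   # disclosed Water Usage Effectiveness
    water_stressed: bool   # local basin under stress
    latency_ms: float      # RTT to the user population

# Hypothetical regions with illustrative numbers
REGIONS = [
    Region("us-southwest", wue_l_per_kwh=1.8, water_stressed=True,  latency_ms=20),
    Region("nordic-north", wue_l_per_kwh=0.1, water_stressed=False, latency_ms=65),
]

def place(workload: str, max_latency_ms: float) -> Region:
    """Pick the least water-hungry region that meets the latency budget:
    prefer unstressed basins first, then the lowest WUE."""
    candidates = [r for r in REGIONS if r.latency_ms <= max_latency_ms]
    candidates = candidates or REGIONS  # nothing qualifies: fall back to all
    return min(candidates, key=lambda r: (r.water_stressed, r.wue_l_per_kwh))

print(place("training", max_latency_ms=1e9).name)   # nordic-north
print(place("inference", max_latency_ms=30).name)   # us-southwest
```

The point isn't the ten lines of Python; it's that once WUE is disclosed, water can enter placement decisions as mechanically as latency does today.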
Data centers are consuming water at a scale that rivals entire towns, driven by evaporative cooling and the relentless growth of AI workloads. The February 2026 Wharton findings confirmed what local communities in drought-prone regions already knew: this industry has a water problem it hasn’t been forced to confront. Direct liquid cooling, smarter geographic siting, and mandatory disclosure requirements offer a credible path forward. If you’re choosing a cloud provider or deploying infrastructure, it’s worth checking whether they publish a WUE score. Every query has a water footprint. Making that footprint visible is a reasonable place to start.
References
- WHYY: Hyperscale data centers can consume up to 5 million gallons of drinking water per day
- Airsys North America: Average hyperscale data centers (roughly 100 MW) consume around 528,000 gallons per day
- TNFD: U.S. data centers consumed an estimated 66 billion liters of water in 2023
- EcoFlow: Some large data centers require up to 5 million gallons of water daily
- Bluefield Research: On-site water consumption projected to grow to 34 billion gallons by 2030
- IIGCC: Google reported withdrawing 37 billion liters of water, 80% consumed through evaporation