Energy gets the headlines, but water may be AI’s more acute crisis. By 2030, AI data centers could consume 450 million gallons of water daily—equivalent to the daily water use of 8 million people. And half of US data centers already sit in water-stressed regions.
The Water Stress dimension of AI infrastructure reveals a resource constraint that's harder to solve than the electricity problem:
The Hidden Cost Per Query:
Each AI prompt requires approximately 0.26 mL of water for data center cooling. Microscopic on its own, until multiplied by billions: at ChatGPT's roughly 2.5 billion daily queries, that works out to about 650,000 liters (roughly 170,000 gallons) of cooling water every day.
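For a sense of scale, here is a minimal back-of-the-envelope sketch that reproduces that figure. The only inputs are the two numbers quoted above (0.26 mL per prompt and 2.5 billion daily queries); everything else is standard unit conversion.

```python
# Back-of-the-envelope check of the per-query figure quoted above.
# Inputs (0.26 mL per prompt, 2.5 billion daily ChatGPT queries) come
# from this post; the rest is unit conversion.

ML_PER_QUERY = 0.26          # cooling water per prompt, in milliliters
QUERIES_PER_DAY = 2.5e9      # ChatGPT prompts per day

milliliters_per_day = ML_PER_QUERY * QUERIES_PER_DAY
liters_per_day = milliliters_per_day / 1_000
us_gallons_per_day = liters_per_day / 3.785  # liters per US gallon

print(f"{liters_per_day:,.0f} liters of cooling water per day")
print(f"{us_gallons_per_day:,.0f} US gallons of cooling water per day")
# -> roughly 650,000 liters, or about 172,000 gallons, per day
```

Hundreds of thousands of liters a day for a single service, from a per-prompt figure measured in fractions of a milliliter.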
Why Water Is Different From Power:
- Electricity can be transmitted: Power from distant sources reaches data centers via the grid
- Water is local: Cooling water must come from nearby sources
- Electricity can be generated: Build more power plants, add renewables
- Water is finite: Aquifers deplete, rivers have competing uses
Geographic Implications:
Data center location decisions increasingly depend on water availability, not just power costs. The Southwest US—home to major tech clusters—faces structural water shortages. This may force AI infrastructure toward the Great Lakes region, Pacific Northwest, or international locations with water abundance.
Innovation Pressure:
Water constraints are driving investment in alternative cooling technologies: immersion cooling, geothermal systems, and air-cooled designs that eliminate water dependency entirely. Companies solving the water problem gain geographic flexibility others lack.
This post examines the water dimension of The Business Engineer's Infrastructure Economics framework. Read the full analysis: The Economics of an AI Prompt →