Water Usage Efficiency: AI Data Center Cooling Without Crisis

December 2025 Update: Microsoft deploying closed-loop, zero-water evaporation cooling—eliminating evaporative water and reducing usage by 125M+ liters per facility annually. AI DCs consuming 10-50x more cooling water than traditional server farms. Google facilities averaging 550,000 gallons daily. GPT-3 training evaporated 700,000 liters freshwater. Zero-water designs becoming industry direction.

Microsoft's upcoming data centers will use closed-loop, zero-water evaporation cooling that eliminates the need for evaporative water entirely.¹ Once filled at construction, the system recirculates coolant continuously, reducing annual water use by more than 125 million liters per facility. The design represents a fundamental shift in how AI infrastructure approaches water consumption—moving from acceptance of high water usage to engineering it out of existence.

AI data centers consume 10-50 times more cooling water than traditional server farms.² The scale creates genuine sustainability concerns: Google's data centers average 550,000 gallons daily per facility, and training GPT-3 alone evaporated 700,000 liters of freshwater.³ Organizations building AI infrastructure face mounting pressure from regulators, communities, and their own sustainability commitments to address water consumption. Understanding Water Usage Effectiveness (WUE) and the technologies driving zero-water cooling helps navigate this evolving landscape.

Understanding WUE

The Green Grid introduced Water Usage Effectiveness in 2011 as the standardized metric for data center water consumption.⁴ Like Power Usage Effectiveness (PUE) for energy, WUE provides a benchmark for comparing water efficiency across facilities.

WUE calculation

WUE measures liters of water consumed per kilowatt-hour of IT equipment energy:

WUE = Annual Site Water Usage (liters) / Annual IT Equipment Energy (kWh)

The formula captures all water consumption—cooling tower makeup water, humidification, and any other operational water use—relative to the actual compute delivered.

Example calculation:

Facility water usage: 50 million liters/year
IT energy consumption: 100 million kWh/year
WUE = 50,000,000 / 100,000,000 = 0.5 L/kWh
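The calculation above can be expressed as a small helper. This is a minimal sketch; the function name and inputs are illustrative, not part of any standard API:

```python
def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of site water per kWh of IT energy."""
    if annual_it_energy_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return annual_water_liters / annual_it_energy_kwh

# Example from above: 50M liters/year over 100M kWh/year
print(wue(50_000_000, 100_000_000))  # 0.5 L/kWh
```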

WUE benchmarks

Ideal WUE: 0.0 L/kWh. Air-cooled facilities using no evaporative cooling can achieve zero water usage. The tradeoff: higher energy consumption and PUE.

Industry average: 1.8-1.9 L/kWh. Most data centers fall in this range, using evaporative cooling that trades water for energy efficiency.⁵

Best-in-class: 0.3-0.7 L/kWh. NREL's data center achieves 0.7 L/kWh alongside 1.06 PUE, demonstrating that low WUE doesn't require sacrificing energy efficiency.⁶

Regional variation: Microsoft's WUE varies dramatically by location—Arizona operates at 1.52 L/kWh while Singapore achieves 0.02 L/kWh.⁷ Climate, water availability, and cooling technology all influence achievable WUE.

The WUE-PUE tradeoff

WUE and PUE often move inversely:

Air cooling: Zero water usage (WUE = 0) but higher energy consumption (PUE 1.4-1.8)

Evaporative cooling: High water usage (WUE 1.5-2.5) but better energy efficiency (PUE 1.1-1.3)

Liquid cooling: Minimal water usage in closed-loop designs (WUE near 0) with excellent energy efficiency (PUE 1.05-1.2)

Liquid cooling breaks the traditional tradeoff, enabling both low WUE and low PUE—which explains its rapid adoption for AI infrastructure.
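The tradeoff can be made concrete by estimating annual water and total facility energy for a fixed IT load under each approach. The profile numbers below are midpoints taken from the ranges above, not figures for any specific facility:

```python
# Illustrative midpoint figures from the ranges above; real values
# vary widely by climate, design, and load.
COOLING_PROFILES = {
    "air":         {"wue": 0.0,  "pue": 1.6},
    "evaporative": {"wue": 2.0,  "pue": 1.2},
    "liquid":      {"wue": 0.05, "pue": 1.1},
}

def annual_footprint(it_load_mw: float, profile: str) -> dict:
    """Estimate annual water (liters) and total facility energy (kWh)."""
    p = COOLING_PROFILES[profile]
    it_kwh = it_load_mw * 1000 * 8760           # IT energy over one year
    return {
        "water_liters": p["wue"] * it_kwh,      # WUE = liters per IT kWh
        "facility_kwh": p["pue"] * it_kwh,      # PUE = total / IT energy
    }

for name in COOLING_PROFILES:
    print(name, annual_footprint(10, name))
```

For a 10 MW IT load, the evaporative profile trades roughly 175 million liters of water per year for the lowest-energy air-cooled alternative, while the liquid profile keeps both terms low.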

AI's water consumption challenge

AI workloads create unprecedented water demands through the combination of higher power density and continuous operation.

Scale of consumption

Hyperscaler water usage grew dramatically with AI expansion:

Google: 24,227 megalitres consumed in 2023—three times Microsoft's usage and growing 17% annually.⁸

Microsoft: 7,844 megalitres (about 7.8 million cubic meters) in 2023, with 41% consumed in water-stressed areas. That is up from nearly 6.4 million cubic meters the prior year, which itself marked a 34% year-over-year increase.⁹

Industry projection: Water usage expected to reach 1,068 billion liters annually by 2028—an 11-fold increase from current levels.¹⁰

AI-specific factors

AI workloads drive higher water consumption through several mechanisms:

Power density: GPU racks operate at 50-135 kW, versus 10-20 kW for traditional servers. Higher heat output requires more aggressive cooling.

Continuous operation: Training runs lasting weeks or months generate sustained heat loads without the intermittent idle periods of typical enterprise workloads.

Inference growth: Production AI deployments run inference continuously, creating 24/7 cooling demands that accumulate water consumption.

Per-query impact: Researchers at UC Riverside estimate each 100-word AI prompt uses approximately 519 milliliters of water—roughly one bottle per interaction.¹¹
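At fleet scale, the per-query estimate compounds quickly. A rough back-of-envelope, where the daily query volume is a hypothetical input rather than a reported figure:

```python
ML_PER_QUERY = 519  # UC Riverside estimate for a 100-word prompt

def daily_water_liters(queries_per_day: float) -> float:
    """Water attributable to inference at the per-query estimate above."""
    return queries_per_day * ML_PER_QUERY / 1000  # mL -> liters

# Hypothetical volume: 100 million prompts/day
print(daily_water_liters(100_000_000))  # 51,900,000 liters/day
```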

Geographic concentration

Water stress compounds in regions with heavy AI infrastructure investment:

Arizona: Major hyperscaler presence in desert climate with limited water resources. Microsoft's Arizona facilities operate at 1.52 L/kWh WUE—among their highest globally.

Oregon: Data center proliferation strains water resources in communities dependent on the same sources for agriculture and residential use.

Global expansion: Hyperscalers face criticism for building water-intensive facilities in drought-prone regions while pursuing water-positive commitments.¹²

Cooling technologies and water efficiency

Traditional evaporative cooling

Evaporative cooling remains the dominant technology in existing data centers:

How it works: Water absorbs heat as it evaporates, transferring thermal energy from the facility to the atmosphere. Cooling towers continuously evaporate water to reject heat from the data center.

Water consumption: Evaporative systems consume 1.5-3.0 L/kWh depending on climate and efficiency.

Energy advantage: Evaporative cooling reduces compressor work, improving PUE by 15-30% versus mechanical cooling in suitable climates.

Limitations: High water consumption, makeup water treatment requirements, and legionella risk from cooling towers.

Air cooling alternatives

Air-cooled facilities eliminate water consumption but sacrifice energy efficiency:

Mechanical cooling: Compressor-based systems reject heat without water evaporation. Higher energy consumption (PUE 1.4+) but zero water usage.

Free cooling: Using ambient air directly when outdoor temperatures permit. Effective in cool climates but limited applicability for AI infrastructure in high-density configurations.

Best for: Water-stressed regions where water conservation outweighs energy efficiency considerations.

Direct-to-chip liquid cooling

Liquid cooling represents the breakthrough technology enabling both water and energy efficiency:

How it works: Cold plates mount directly onto CPUs, GPUs, memory modules, and voltage regulators. Closed-loop systems circulate coolant through these plates, removing heat at the source before it dissipates into the air.¹³

Water consumption: Closed-loop designs use no water in normal operation. The system fills once at construction and recirculates continuously.

Energy efficiency: Liquid cooling achieves PUE below 1.2 while eliminating water consumption entirely.¹⁴

NVIDIA implementation: The GB200 NVL72 rack-scale liquid-cooled system delivers 300x better water efficiency than traditional air-cooled architectures.¹⁵

Two-phase cooling

Advanced liquid cooling uses phase change for maximum efficiency:

How it works: Specially formulated dielectric fluid (from suppliers like Honeywell and Chemours) boils at temperatures as low as 18°C. The phase change absorbs significant heat energy, providing more efficient cooling than single-phase liquid systems.¹⁶

Waterless operation: ZutaCore's HyperCool technology removes heat directly at the source, eliminating water usage and cutting energy consumption by up to 82%.¹⁷

Safety advantages: Dielectric fluids won't damage electronics if leaked, unlike water-based coolants.

Immersion cooling

Full immersion provides the ultimate heat density solution:

Single-phase immersion: Servers submerge in dielectric fluid that absorbs heat through convection. No water required.

Two-phase immersion: Servers submerge in low-boiling-point fluid that actively boils adjacent to heat-producing components, providing extremely efficient cooling.

Adoption: Microsoft, Google, and Meta have all implemented immersion cooling for highest-density AI training infrastructure.

Hyperscaler water strategies

Microsoft's water-positive path

Microsoft committed to becoming water-positive by 2030—replenishing more water than consumed across global operations:¹⁸

Zero-water cooling deployment: Closed-loop chip-level liquid cooling eliminates evaporative water entirely. Currently testing in Phoenix, Arizona, and Mt. Pleasant, Wisconsin, with operations expected in 2026. By late 2027, zero-water evaporation becomes the standard across new data centers.

Facility impact: Each zero-water facility reduces annual consumption by more than 125 million liters compared to evaporative designs.

Replenishment projects: Water restoration projects in water-stressed communities offset existing facility consumption.

2023 performance: 7,844 megalitres consumed, though 41% in water-stressed areas highlights the challenge of existing infrastructure.

Google's replenishment commitment

Google pledged to replenish 120% of water consumed by 2030:¹⁹

Operational efficiency: Improving cooling efficiency across existing facilities to reduce baseline consumption.

Watershed partnerships: Collaborating with communities and organizations to replenish water use and improve watershed health.

Technology investment: Supporting water security through technology and innovation beyond direct operations.

2023 consumption: 24,227 megalitres—the highest among major hyperscalers, reflecting Google's data center scale.

Meta's efficiency focus

Meta committed to water positivity by 2030 with emphasis on operational efficiency:²⁰

Construction practices: Using recycled water for construction and implementing best practices to reduce construction water needs.

Facility recycling: Recycling water within facilities multiple times before discharge.

Operational efficiency: Data centers account for most of Meta's water usage, making operational improvements the primary lever.

Lower baseline: 2,938 megalitres in 2023—significantly less than Google or Microsoft, reflecting different infrastructure scale.

AWS's late entry

AWS committed to water positivity by 2030 at re:Invent 2024:²¹

Direct-to-chip adoption: AWS deploys cold plates directly on chips with closed-loop circulation, eliminating water consumption increases from new AI infrastructure.

Engineered fluids: Using specially formulated cooling fluids rather than water, avoiding evaporation losses entirely.

Community replenishment: Returning more water to communities than direct operations consume.

Operational best practices

Measurement and monitoring

Effective water management requires comprehensive measurement:

Metering infrastructure: Install submeters for cooling towers, humidification systems, and any other water-consuming equipment. Monthly or annual aggregates provide more representative WUE than daily snapshots.²²

Real-time monitoring: Track water consumption alongside temperature, humidity, and IT load to identify optimization opportunities.

Baseline establishment: Document current WUE before implementing improvements to measure impact accurately.
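The aggregation step above can be sketched as follows: collect daily submeter readings, then roll them up per month before computing WUE, so short-term weather swings don't distort the metric. The reading values are invented for illustration:

```python
from collections import defaultdict
from datetime import date

# Each reading: (day, liters of cooling water that day, IT kWh that day).
# Hypothetical data; real values come from submeters and PDU/BMS telemetry.
readings = [
    (date(2025, 1, 1), 140_000, 270_000),
    (date(2025, 1, 2), 130_000, 268_000),
    (date(2025, 2, 1), 150_000, 275_000),
]

def monthly_wue(readings):
    """Aggregate daily submeter readings into per-month WUE (L/kWh)."""
    totals = defaultdict(lambda: [0.0, 0.0])
    for day, liters, kwh in readings:
        key = (day.year, day.month)
        totals[key][0] += liters
        totals[key][1] += kwh
    return {k: water / energy for k, (water, energy) in totals.items()}

print(monthly_wue(readings))
```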

Temperature and humidity optimization

Adjusting environmental parameters reduces water consumption:

Raise temperature setpoints: ASHRAE's recommended envelope allows inlet temperatures up to 27°C (the A1 allowable range extends to 32°C). Higher setpoints reduce cooling load and associated water consumption.

Widen humidity ranges: Reducing humidification requirements saves water directly. Modern equipment tolerates wider humidity ranges than legacy assumptions.

Impact: Combined temperature and humidity optimization can reduce water consumption 10-20% in evaporative-cooled facilities.

Water source alternatives

Reduce freshwater impact through alternative sources:

Recycled water: Many facilities can use non-potable recycled water for cooling tower makeup. This doesn't lower the WUE metric itself but reduces freshwater impact.

Rainwater harvesting: Capturing precipitation for cooling system makeup reduces municipal water demand.

Greywater systems: Processing on-site wastewater for cooling applications where regulations permit.

Heat recovery

Capturing waste heat creates value from cooling operations:

District heating: Nordic facilities increasingly supply waste heat to district energy systems, warming nearby buildings while cooling servers.

Industrial processes: Adjacent facilities may use low-grade heat for manufacturing or agricultural applications.

Economic offset: Heat recovery revenue can offset cooling system costs, improving economics of efficient cooling investments.

Regulatory and community landscape

Emerging regulations

Water consumption faces increasing regulatory attention:

EU requirements: European data center sustainability reporting requirements include water consumption metrics.

US state regulations: Arizona, Oregon, and other water-stressed states implementing or considering data center water use limits.

Disclosure mandates: SEC climate disclosure rules and similar regulations require reporting of material water risks.

Community relations

Local opposition to data center water consumption has blocked or delayed projects:

Competing uses: Agricultural, residential, and industrial water users increasingly view data centers as competitors for scarce resources.

Aquifer concerns: Groundwater-dependent communities resist facilities that may draw down aquifers or impact well levels.

Permit challenges: Projects in water-stressed regions face longer approval timelines and additional conditions.

Sustainability commitments

Corporate water goals drive technology adoption:

Water positive pledges: Major hyperscalers committed to replenishing more water than consumed by 2030.

Customer pressure: Enterprise customers increasingly evaluate supplier water practices in procurement decisions.

Investor scrutiny: ESG-focused investors assess water risk as material to data center operations.

Technology selection framework

Decision criteria

Evaluate cooling technology against multiple factors:

Water availability: Water-stressed regions favor air cooling or closed-loop liquid systems despite energy tradeoffs.

Power density: AI infrastructure exceeding 30kW per rack typically requires liquid cooling regardless of water considerations.

Climate: Cool climates enable free cooling and reduce both water and energy consumption. Hot climates force tradeoffs.

Existing infrastructure: Retrofitting liquid cooling into air-cooled facilities costs more than designing for liquid from the start.

Technology recommendations by scenario

New AI facility, water-stressed region: Closed-loop direct-to-chip liquid cooling. Zero water consumption with excellent energy efficiency. Higher capital cost offset by operational savings and regulatory simplicity.

New AI facility, water-abundant region: Direct-to-chip liquid cooling remains preferred for PUE advantages, even where water consumption is less constrained.

Existing facility upgrade: Evaluate rear-door heat exchangers or in-row liquid cooling as incremental improvements. Full liquid conversion may not justify cost for facilities with remaining useful life.

Colocation selection: Prioritize facilities with WUE disclosure, liquid cooling capability, and water-positive commitments. Request WUE data alongside PUE in RFPs.
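The scenarios above reduce to a simple decision sketch. This is a simplification of the text's recommendations, not an engineering rule, and the function and return strings are illustrative:

```python
def recommend_cooling(water_stressed: bool, rack_kw: float, new_build: bool) -> str:
    """Map the decision criteria above to a starting recommendation."""
    if rack_kw > 30:
        # Density above ~30 kW/rack typically forces liquid regardless
        return "direct-to-chip liquid (closed loop)"
    if not new_build:
        # Incremental retrofit options for existing facilities
        return "rear-door heat exchanger / in-row retrofit"
    if water_stressed:
        return "closed-loop liquid or air cooling"
    return "evaluate liquid cooling for PUE advantage"

print(recommend_cooling(water_stressed=True, rack_kw=80, new_build=True))
```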

Organizations planning sustainable AI infrastructure can leverage Introl's global deployment expertise for cooling technology selection and implementation across 257 locations worldwide.

The zero-water future

Microsoft's 2027 commitment to zero-water evaporation cooling across new data centers signals the industry direction. The technology works—closed-loop liquid cooling delivers both water efficiency and energy efficiency without compromise. The remaining barriers center on capital cost and retrofit complexity for existing facilities.

For new AI infrastructure, liquid cooling represents the clear path forward. Water consumption drops to near zero while PUE improves versus air-cooled alternatives. The combination satisfies sustainability commitments, regulatory requirements, and community expectations simultaneously.

Existing facilities face harder choices. Evaporative cooling remains economically attractive where water costs stay low and community relations allow continued consumption. Incremental improvements—temperature setpoint optimization, recycled water adoption, partial liquid cooling retrofits—reduce consumption without full system replacement.

The organizations that master water efficiency gain competitive advantages beyond sustainability metrics. Regulatory risk diminishes. Community opposition fades. Site selection expands to water-stressed regions previously off-limits. And the combination of zero-water cooling with renewable energy positions AI infrastructure for a future where resource constraints increasingly determine competitive positioning.

Water efficiency for AI data centers isn't optional—regulatory, community, and corporate pressures converge to make water consumption a strategic consideration alongside power, land, and connectivity. The technology exists to address it. The leading organizations deploy that technology proactively rather than waiting for constraints to force the transition.

References

  1. Data Center Dynamics. "Microsoft's upcoming data centers to use closed loop, zero-water evaporation design." August 2024. https://www.datacenterdynamics.com/en/news/microsofts-upcoming-data-centers-to-use-closed-loop-zero-water-evaporation-design/

  2. KETOS. "Conventional vs AI Data Center Cooling and Wastewater." 2025. https://ketos.co/conventional-vs-ai-data-center-cooling-options-and-how-much-wastewater-is-being-generated

  3. Stanford & the West. "Thirsty for power and water, AI-crunching data centers sprout across the West." 2025. https://andthewest.stanford.edu/2025/thirsty-for-power-and-water-ai-crunching-data-centers-sprout-across-the-west/

  4. Equinix Blog. "What Is Water Usage Effectiveness (WUE) in Data Centers?" November 13, 2024. https://blog.equinix.com/blog/2024/11/13/what-is-water-usage-effectiveness-wue-in-data-centers/

  5. Data Center Knowledge. "A Guide to Data Center Water Usage Effectiveness (WUE) and Best Practices." 2025. https://www.datacenterknowledge.com/cooling/a-guide-to-data-center-water-usage-effectiveness-wue-and-best-practices

  6. Department of Energy. "Cooling Water Efficiency Opportunities for Federal Data Centers." https://www.energy.gov/femp/cooling-water-efficiency-opportunities-federal-data-centers

  7. Lenovo StoryHub. "The world's AI generators: rethinking water usage in data centers to build a more sustainable future." 2025. https://news.lenovo.com/data-centers-worlds-ai-generators-water-usage/

  8. Asia Financial. "AI Data Centres Using Much More Water Than Expected." 2025. https://www.asiafinancial.com/ai-data-centres-using-much-more-water-than-expected

  9. Cloud Computing News. "Cloud's hidden cost: Data centre water consumption creates a global crisis." 2025. https://www.cloudcomputing-news.net/news/data-centre-water-consumption-crisis/

  10. Mongabay. "AI data center revolution sucks up world's energy, water, materials." November 2025. https://news.mongabay.com/2025/11/ai-data-center-revolution-sucks-up-worlds-energy-water-materials/

  11. IEEE Spectrum. "The Real Story on AI Water Usage at Data Centers." 2025. https://spectrum.ieee.org/ai-water-usage

  12. Business & Human Rights Resource Centre. "Amazon, Google, & Microsoft allegedly operating and expanding water-intensive data centres in some of the world's driest regions." 2025. https://www.business-humanrights.org/en/latest-news/amazon-google-microsoft-allegedly-operating-and-expanding-water-intensive-datacentres-in-some-of-the-worlds-driest-region/

  13. Vertiv. "Understanding direct-to-chip cooling in HPC infrastructure: A deep dive into liquid cooling." 2025. https://www.vertiv.com/en-us/about/news-and-insights/articles/educational-articles/understanding-direct-to-chip-cooling-in-hpc-infrastructure-a-deep-dive-into-liquid-cooling/

  14. Datacenters.com. "Why Liquid Cooling Is the New Standard for Data Centers in 2025." 2025. https://www.datacenters.com/news/why-liquid-cooling-is-becoming-the-data-center-standard

  15. NVIDIA Blog. "NVIDIA Blackwell Platform Boosts Water Efficiency by Over 300x." 2025. https://blogs.nvidia.com/blog/blackwell-platform-water-efficiency-liquid-cooling-data-centers-ai-factories/

  16. IEEE Spectrum. "Data Center Liquid Cooling: The AI Heat Solution." 2025. https://spectrum.ieee.org/data-center-liquid-cooling

  17. ZutaCore. "HyperCool - Server Liquid Cooling System." 2025. https://zutacore.com/solutions/

  18. Microsoft Datacenters. "Sustainability." 2025. https://datacenters.microsoft.com/sustainability/

  19. ESG Today. "Google Commits to be Water Positive, Replenishing More Water than Consumed by 2030." 2021. https://www.esgtoday.com/google-commits-to-be-water-positive-replenishing-more-water-than-consumed-by-2030/

  20. Meta Sustainability. "What Makes a Data Center Sustainable?" April 17, 2023. https://sustainability.atmeta.com/blog/2023/04/17/what-makes-a-data-center-sustainable/

  21. Data Center Dynamics. "AWS pledges to be water positive by 2030." 2024. https://www.datacenterdynamics.com/en/news/aws-pledges-to-be-water-positive-by-2030/

  22. Nlyte. "The Role of Water Usage Effectiveness (WUE) in Sustainable Data Centers." 2025. https://www.nlyte.com/blog/the-role-of-water-usage-effectiveness-wue-in-sustainable-data-centers/


Key takeaways

For strategic planners: - AI data centers consume 10-50x more water than traditional facilities; each 100-word AI prompt uses ~519ml water - Google: 24,227 megalitres/year (3x Microsoft); industry projection: 1,068B liters annually by 2028 (11x current) - Hyperscaler commitments: Microsoft zero-water by 2027, Google 120% replenishment by 2030, AWS water-positive by 2030

For infrastructure architects: - WUE benchmarks: ideal 0.0 L/kWh (air-cooled), industry avg 1.8-1.9 L/kWh, best-in-class 0.3-0.7 L/kWh (NREL achieves 0.7 L/kWh + 1.06 PUE) - NVIDIA GB200 NVL72 achieves 300x better water efficiency than air-cooled via closed-loop liquid cooling - Liquid cooling breaks WUE-PUE tradeoff: WUE near 0 + PUE 1.05-1.2 (vs air cooling WUE 0 but PUE 1.4-1.8)

For sustainability teams: - Microsoft regional WUE: Arizona 1.52 L/kWh vs Singapore 0.02 L/kWh—location dramatically impacts water consumption - Microsoft zero-water design: closed-loop chip-level liquid cooling eliminates evaporation; saves 125M liters/year per facility - ZutaCore HyperCool: eliminates water usage, cuts energy 82%; two-phase cooling uses dielectric fluid (safe if leaked)

For regulatory compliance: - EU sustainability reporting requires WUE disclosure; Arizona/Oregon implementing or considering DC water limits - 41% of Microsoft's water consumed in water-stressed areas; community opposition blocking projects in drought regions - 83% of enterprise customers consider sustainability in vendor selection; green colocation commands 5-10% premium

