Liquid Cooling vs Air: The 50kW GPU Rack Guide (2025)

The exponential growth of AI workloads has pushed data center cooling to a critical inflection point. As GPU rack densities surge past 50kW—with next-generation systems demanding 100kW and beyond—traditional air cooling has reached its fundamental physical limits. This analysis examines how the industry is navigating that thermal transformation through advanced liquid cooling technologies, which deliver 10-21% energy savings and a 40% reduction in cooling costs while enabling the infrastructure the AI revolution requires.

When air becomes the bottleneck

The failure of air cooling at high densities isn't gradual—it's a cliff. At 50kW per rack, the physics become unforgiving: cooling requires 7,850 cubic feet per minute (CFM) of airflow at a 20°F temperature differential. Double that to 100kW and you need 15,700 CFM—creating hurricane-force winds through server intakes measuring just 2-4 square inches. The fundamental heat removal equation, Q (watts) = 0.318 × CFM × ΔT (°F), reveals the core problem: as density increases, the required airflow scales linearly, but fan power consumption scales with the cube of fan speed. A 10% increase in airflow therefore demands 33% more fan power, an energy spiral that makes high-density air cooling economically and practically impossible.
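
For readers who want to sanity-check these figures, the short Python sketch below applies the same heat-removal equation and the fan affinity law. It assumes Q in watts, ΔT in degrees Fahrenheit, and the cubic relationship between fan speed and power described above.

```python
# A minimal sketch of the airflow and fan-power arithmetic above.
# Assumes Q in watts, delta_t in °F, and the fan affinity law (power ∝ speed³).

def required_cfm(heat_watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow (CFM) needed to remove heat_watts at the given ΔT, per Q = 0.318 × CFM × ΔT."""
    return heat_watts / (0.318 * delta_t_f)

def fan_power_multiplier(airflow_increase: float) -> float:
    """Relative fan power for a fractional airflow increase (cube law)."""
    return (1.0 + airflow_increase) ** 3

for rack_kw in (50, 100):
    print(f"{rack_kw} kW rack -> {required_cfm(rack_kw * 1000):,.0f} CFM at 20°F ΔT")

print(f"+10% airflow -> {fan_power_multiplier(0.10):.2f}x fan power")
```

Running it reproduces the roughly 7,850 and 15,700 CFM figures quoted above (to within rounding of the 0.318 constant) and the 33% fan-power penalty for a 10% airflow increase.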

Real-world evidence confirms these theoretical limits. One documented case showed 250 racks at just 6kW each climbing from 72°F to over 90°F within 75 seconds of a cooling failure. Traditional data centers designed for 5-10kW average rack densities simply cannot handle modern GPU workloads. Even with advanced hot/cold aisle containment, air cooling struggles beyond 40kW, while uncontained systems suffer 20-40% capacity losses from hot air recirculation. The new ASHRAE H1 environmental class, created explicitly for high-density equipment, recommends a narrow 18-22°C envelope—a range impossible to maintain with air cooling at GPU scales.

Liquid cooling technologies transform the possible

The transition to liquid cooling represents more than incremental improvement—it's a fundamental reimagining of heat removal. Water carries roughly 3,500 times more heat per unit volume than air, enabling cooling capacities that make 100kW+ racks routine rather than remarkable.

Direct-to-chip cooling leads the transformation, with cold plates featuring microchannels (27-100 microns) attached directly to processors. Operating with supply water at 40°C and return at 50°C, these systems remove 70-75% of rack heat through liquid while maintaining 1.02-1.03 partial PUE. Modern implementations support 1.5kW+ per chip with flow rates of 13 liters per minute for a 9kW server. The remaining 25-30% of heat—from memory, drives, and auxiliary components—still requires air cooling, making these hybrid systems the practical choice for most deployments.
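
As a rough cross-check of that flow-rate figure, the sketch below applies Q = ṁ × c_p × ΔT with plain-water properties to the 40°C supply / 50°C return loop described above. The fluid properties are an assumption; production loops often run treated water or a water/glycol mix, which shifts the numbers slightly.

```python
# A rough cross-check of the 13 L/min figure using Q = m_dot × c_p × ΔT.
# Assumes plain-water properties near 45°C (c_p ≈ 4186 J/kg·K, ρ ≈ 990 kg/m³).

CP_WATER = 4186.0   # J/(kg·K)
RHO_WATER = 990.0   # kg/m³

def coolant_flow_lpm(heat_watts: float, supply_c: float, return_c: float) -> float:
    """Coolant flow in litres per minute needed to absorb heat_watts over the loop ΔT."""
    mass_flow_kg_s = heat_watts / (CP_WATER * (return_c - supply_c))
    return mass_flow_kg_s / RHO_WATER * 1000.0 * 60.0   # kg/s -> L/min

# A 9 kW server on a 40°C supply / 50°C return loop:
print(f"{coolant_flow_lpm(9000, 40, 50):.1f} L/min")   # ≈ 13 L/min, matching the figure above
```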

Immersion cooling pushes boundaries further, submerging entire servers in dielectric fluids. Single-phase systems using mineral oils cost $50-100 per gallon and consistently support 200kW per rack. Two-phase systems promise superior heat transfer through boiling and condensation, but face challenges: fluorocarbon fluids cost $500-1000 per gallon, and 3M's discontinuation of production by 2025 due to environmental concerns has frozen adoption. The technology's complexity—sealed enclosures, cavitation risks, and PFAS regulations—limits deployment to specialized applications.

Coolant Distribution Units (CDUs) form the backbone of liquid cooling infrastructure. Modern units range from 7kW rack-mount systems to 2,000kW+ giants like CoolIT's CHx2000. Leading vendors—Vertiv, Schneider Electric, Motivair, and CoolIT—offer solutions with N+1 redundancy, 50-micron filtration, and variable frequency drives for load matching. The CDU market, valued at $1 billion in 2024, is projected to reach $3.6 billion by 2031 (20.5% CAGR), reflecting liquid cooling's rapid adoption.

The art and economics of retrofitting

Transitioning existing data centers to liquid cooling requires careful orchestration. The most successful approach follows a phased migration: starting with 1-2 high-density racks, expanding to a row, then scaling based on demand. Three primary retrofit paths have emerged: liquid-to-air CDUs that leverage existing air conditioning, rear-door heat exchangers that can cool up to 40kW per rack, and direct-to-chip solutions for maximum efficiency.

Infrastructure modifications present the primary challenge. Power infrastructure often becomes the limiting factor—facilities designed for 5-10kW average loads cannot support 50kW+ racks regardless of cooling capability. Plumbing requires careful CFD modeling in raised-floor environments or overhead installation with drip pans in slab construction. Floor loading, particularly for immersion systems, may exceed structural capacity in older facilities.

Cost analysis reveals compelling economics despite high initial investment. A California Energy Commission study documented a complete liquid cooling system for 1,200 servers across 17 racks at a total cost of $470,557, or $392 per server, including facility modifications. Annual energy savings of 355 MWh ($39,155 at $0.11/kWh) yield a 12-year simple payback, though optimized implementations achieve 2-5 year returns. Schneider Electric's analysis shows 14% capital savings through 4x rack compaction, while operational savings include a 10.2% reduction in total data center power and a 15.5% improvement in Total Usage Effectiveness.
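
The payback arithmetic itself is straightforward. The minimal sketch below reproduces the 12-year figure from the cited study's numbers, ignoring discounting, maintenance, and the capital savings from rack compaction.

```python
# Simple payback on the California Energy Commission numbers cited above,
# ignoring discounting, maintenance, and capital savings from rack compaction.

capex = 470_557          # total system cost: 1,200 servers across 17 racks (USD)
annual_savings = 39_155  # 355 MWh/yr of energy savings at $0.11/kWh (USD)

payback_years = capex / annual_savings
print(f"Simple payback: {payback_years:.1f} years")   # ≈ 12 years
```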

Integration challenges multiply in hybrid environments. Even "fully liquid-cooled" facilities require 20-30% air cooling capacity for auxiliary components. Control systems must coordinate multiple cooling technologies, monitoring both rack inlet temperatures and supply water conditions. Redundancy becomes critical—rear-door heat exchangers must fail over to air cooling when opened for service, while direct-to-chip systems have less than 10 seconds of ride-through time at full load.

From pilots to production

Real-world deployments demonstrate liquid cooling's maturity. Meta leads adoption at scale, implementing Air-Assisted Liquid Cooling across 40+ million square feet of data center space. Their Catalina rack design supports 140kW with 72 GPUs, while facility-wide liquid cooling deployment targets completion by early 2025. The transformation required scrapping multiple in-construction data centers for AI-optimized redesigns, expecting 31% cost savings from the new architecture.

Google's seven-year journey with liquid-cooled TPUs provides the industry's most comprehensive dataset. Deploying closed-loop systems across 2000+ TPU Pods at gigawatt scale, they've achieved 99.999% uptime while demonstrating 30x greater thermal conductivity than air. Their fifth-generation CDU design, Project Deschutes, will be contributed to the Open Compute Project, accelerating industry-wide adoption.

Microsoft pushes boundaries with two-phase immersion cooling in production, using dielectric fluids that boil at 122°F (50°C), far below water's 100°C boiling point. The technology enables 5-15% server power reduction while eliminating cooling fans. Their commitment to 95% water use reduction by 2024 drives innovation in closed-loop, zero-evaporation systems.

Specialized providers like CoreWeave demonstrate liquid cooling for AI workloads. Planning 4,000 GPU deployments by end-2024, they're achieving 130kW rack densities with 20% better system utilization than competitors. Their rail-optimized designs save 3.1 million GPU hours through improved reliability, deploying H100 clusters in under 60 days.

Meeting the thermal demands of AI accelerators

GPU specifications reveal why liquid cooling has become mandatory. The NVIDIA H100 SXM5 operates at 700W TDP, requiring liquid cooling for optimal performance. The H200 maintains the same power envelope while delivering 141GB of HBM3e memory at 4.8TB/s—1.4x the H100's bandwidth within the same thermal budget. The upcoming B200 pushes boundaries further: 1,200W for liquid-cooled variants versus 1,000W for air-cooled, with 20 PFLOPS of FP4 performance demanding sophisticated thermal management.

The GB200 NVL72—packing 72 Blackwell GPUs and 36 Grace CPUs in a single rack—marks the point where air cooling is no longer viable. At 140kW of rack power, it mandates liquid cooling through newly developed cold plates and 250kW CDUs. System-level considerations compound the complexity: NVSwitch interconnects add 10-15W each, while high-speed memory and power delivery systems contribute substantial additional heat.
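
To see how those components add up, here is an illustrative back-of-the-envelope budget for a 140kW NVL72-class rack. The per-GPU and per-interconnect wattages come from this article; the Grace CPU envelope and the interconnect count are assumptions for illustration only, not NVIDIA figures.

```python
# An illustrative power budget for a 140 kW NVL72-class rack. Per-GPU and
# per-interconnect wattages come from this article; the Grace CPU envelope and
# the interconnect count are ASSUMPTIONS for illustration, not vendor figures.

GPU_W = 1200            # liquid-cooled Blackwell figure cited above
GRACE_CPU_W = 300       # ASSUMPTION: rough per-CPU envelope
NVSWITCH_LINK_W = 15    # upper end of the 10-15 W per-interconnect range above

n_gpus, n_cpus, n_links = 72, 36, 72   # link count is an assumption

compute_kw = (n_gpus * GPU_W + n_cpus * GRACE_CPU_W) / 1000
interconnect_kw = n_links * NVSWITCH_LINK_W / 1000
other_kw = 140 - compute_kw - interconnect_kw   # memory, NICs, pumps, conversion losses

print(f"compute ≈ {compute_kw:.1f} kW, interconnect ≈ {interconnect_kw:.1f} kW, "
      f"everything else ≈ {other_kw:.1f} kW of a 140 kW rack")
```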

Technical analysis by JetCool demonstrates stark performance differences: their H100 SmartPlate achieves 0.021°C/W thermal resistance, running chips 35°C cooler than air alternatives while supporting 60°C inlet temperatures. This temperature reduction theoretically extends GPU lifespan 8x while enabling sustained maximum performance—critical for multi-week AI training runs.
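
A simple lumped thermal-resistance model shows what a 0.021°C/W figure implies in practice. The sketch below treats the cold plate as a single resistance between die and coolant, a deliberate simplification of the real thermal stack.

```python
# A lumped thermal-resistance sketch of what 0.021 °C/W means in practice:
# steady-state die temperature ≈ coolant inlet temperature + R_th × power.
# This ignores the full thermal stack and is illustrative only.

def die_temp_c(inlet_c: float, r_th_c_per_w: float, power_w: float) -> float:
    return inlet_c + r_th_c_per_w * power_w

# A 700 W H100-class GPU on a 0.021 °C/W cold plate with a 60 °C inlet:
print(f"{die_temp_c(60.0, 0.021, 700.0):.1f} °C")   # ≈ 74.7 °C at the die
```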

The roadmap to 2030

The industry stands at a transformation point where best practices rapidly evolve into requirements. ASHRAE's new H1 environmental class (18-22°C recommended) acknowledges that traditional guidelines cannot accommodate AI workloads. The Open Compute Project's liquid cooling standards drive interoperability, while their Immersion Requirements Rev. 2.10 establishes qualification processes for emerging technologies.

Two-phase immersion cooling, despite current challenges, shows promise for 2025-2027 mainstream adoption. Market projections indicate growth from $375 million (2024) to $1.2 billion (2032), driven by superior heat transfer enabling 1,500W+ per chip. Innovations like Accelsius NeuCool and alternatives to discontinued 3M fluids address environmental concerns while maintaining performance.

AI-driven optimization delivers immediate returns. Google DeepMind's implementation achieved 40% cooling energy reduction through real-time learning, while Siemens' White Space Cooling Optimization and similar platforms proliferate. These systems predict failures, optimize coolant chemistry, and adjust dynamically to workload patterns—capabilities that 91% of vendors expect to be ubiquitous within five years.

Waste heat recovery transforms liability into an asset. Stockholm Data Parks already heats 10,000 households with data center waste, targeting 10% of city heating by 2035. Regulatory pressure accelerates adoption: Germany mandates 20% heat reuse by 2028, while California Title 24 requires recovery infrastructure in new construction. Heat pump technology elevates 30-40°C waste heat to 70-80°C for district heating, creating revenue streams from formerly discarded energy.
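
The thermodynamics of that temperature lift are easy to bound. The sketch below computes the ideal (Carnot) heating COP for raising 40°C return water to a 75°C district loop; real heat pumps achieve only a fraction of this bound, so the temperatures and result are illustrative rather than project figures.

```python
# An upper-bound look at lifting 40°C waste heat to a 75°C district-heating loop,
# using the ideal (Carnot) heating COP. Real heat pumps reach only a fraction of
# this bound; the temperatures below are illustrative, not project figures.

def carnot_heating_cop(t_source_c: float, t_sink_c: float) -> float:
    """Ideal heating COP for pumping heat from t_source_c up to t_sink_c."""
    t_source_k = t_source_c + 273.15
    t_sink_k = t_sink_c + 273.15
    return t_sink_k / (t_sink_k - t_source_k)

print(f"Carnot COP ≈ {carnot_heating_cop(40.0, 75.0):.1f}")   # ≈ 10; real units land well below this
```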

Making the transition

Success in liquid cooling deployment requires strategic planning across multiple dimensions. Organizations should begin with straightforward liquid-to-air CDUs for the lowest-barrier entry, but must assess power infrastructure first: inadequate electrical capacity rules out a retrofit regardless of cooling technology. Starting with 1-2 rack pilots allows learning before scaling, while maintaining air cooling expertise remains critical for hybrid operations.

Financial modeling must account for total system value. While initial investment ranges from $1,000 to $2,000 per kW of cooling capacity, operational savings compound: 27% facility power reduction in optimized implementations, 30% cooling energy savings versus conventional systems, and, critically, the ability to deploy revenue-generating AI workloads that air cooling cannot support. Leading implementations achieve sub-2-year paybacks through careful design: bypassing inefficient chiller integration saves 20-30%, while focusing on the highest-density applications maximizes return.

Technical teams require new competencies. Beyond traditional HVAC knowledge, staff must understand coolant chemistry, leak response protocols, and integrated control systems. Vendor partnerships prove essential—24/7 support for specialized components and regular preventive maintenance at 6-month intervals become operational necessities. Safety protocols expand to include dielectric fluid handling and pressure system management.

The market signals overwhelming momentum. Data center liquid cooling is projected to grow from $4.9 billion in 2024 to $21.3 billion by 2030, a 27.6% CAGR. Single-phase direct-to-chip cooling becomes the standard for AI workloads by 2025-2026, while two-phase immersion reaches mainstream adoption by 2027. By 2030, 1MW racks will require advanced liquid cooling as the standard, not the exception.

Conclusion

The physics are clear: air cooling has reached its limits. At 50-100kW rack densities, fundamental thermodynamic constraints make liquid cooling not just preferable but mandatory. The transition represents the most significant infrastructure shift in data center history, requiring new skills, considerable investment, and operational transformation. Yet the benefits—10-21% energy savings, 40% cooling cost reduction, 8x reliability improvement, and most critically, the ability to deploy next-generation AI infrastructure—make this evolution inevitable. Organizations that master liquid cooling today will power the AI breakthroughs of tomorrow—those who delay fall behind as the industry races toward ever-higher computational densities. We've reached the thermal wall; liquid cooling is how we break through.

References

ACM Digital Library. "Energy-efficient LLM Training in GPU datacenters with Immersion Cooling Systems." Proceedings of the 16th ACM International Conference on Future and Sustainable Energy Systems. 2025. https://dl.acm.org/doi/10.1145/3679240.3734609.

AMAX. "Comparing NVIDIA Blackwell Configurations." 2025. https://www.amax.com/comparing-nvidia-blackwell-configurations/.

———. "Top 5 Considerations for Deploying NVIDIA Blackwell." 2025. https://www.amax.com/top-5-considerations-for-deploying-nvidia-blackwell/.

arXiv. "[1309.4887] iDataCool: HPC with Hot-Water Cooling and Energy Reuse." 2013. https://ar5iv.labs.arxiv.org/html/1309.4887.

———. "[1709.05077] Transforming Cooling Optimization for Green Data Center via Deep Reinforcement Learning." 2017. https://ar5iv.labs.arxiv.org/html/1709.05077.

Attom. "Ashrae's New Thermal Guideline Update: A New High Density Trend." Expert Green Prefab Data Centers. 2025. https://attom.tech/ashraes-new-thermal-guideline-update-a-new-high-density-trend/.

Chilldyne. "High-power liquid cooling design: direct-to-chip solution requirements for 500 kW Racks." Chilldyne | Liquid Cooling. July 29, 2024. https://chilldyne.com/2024/07/29/high-power-liquid-cooling-design-direct-to-chip-solution-requirements-for-500-kw-racks/.

Compass Datacenters. "What Is Data Center Cooling?" 2025. https://www.compassdatacenters.com/data-center-cooling/.

Converge Digest. "Meta Outlines AI Infrastructure Upgrades at OCP Summit 2024." 2024. https://convergedigest.com/meta-outlinesai-infrastructure-upgrades-at-ocp-summit-2024/.

Core Winner LTD. "Comprehensive Guide to Liquid Cooling: The Future of High-Performance Data Centers and AI Deployments." 2025. https://www.corewinner.com/en/blog/detail/52.

CoreWeave. "Building AI Clusters for Enterprises 2025." 2025. https://www.coreweave.com/blog/building-ai-clusters-for-enterprises-2025.

———. "GPUs for AI Models and Innovation." 2025. https://www.coreweave.com/products/gpu-compute.

Cyber Defense Advisors. "AI-Driven Predictive Maintenance: The Future of Data Center Reliability." 2025. https://cyberdefenseadvisors.com/ai-driven-predictive-maintenance-the-future-of-data-center-reliability/.

Data Center Catalog. "Meta Plans Shift to Liquid Cooling for its Data Center Infrastructure." 2022. https://datacentercatalog.com/news/2022/meta-plans-shift-to-liquid-cooling-for-its-data-center-infrastructure.

Data Center Dynamics. "An introduction to liquid cooling in the data center." 2025. https://www.datacenterdynamics.com/en/analysis/an-introduction-to-liquid-cooling-in-the-data-center/.

———. "Hyperscalers prepare for 1MW racks at OCP EMEA; Google announces new CDU." 2025. https://www.datacenterdynamics.com/en/news/hyperscalers-prepare-for-1mw-racks-at-ocp-emea-google-announces-new-cdu/.

———. "New ASHRAE guidelines challenge efficiency drive." 2025. https://www.datacenterdynamics.com/en/opinions/new-ashrae-guidelines-challenge-efficiency-drive/.

———. "Nvidia's CEO confirms upcoming system will be liquid cooled." 2025. https://www.datacenterdynamics.com/en/news/nvidias-ceo-confirms-next-dgx-will-be-liquid-cooled/.

———. "Optimizing data center efficiency with direct-to-chip liquid cooling." 2025. https://www.datacenterdynamics.com/en/opinions/optimizing-data-center-efficiency-with-direct-to-chip-liquid-cooling/.

———. "Two-phase cooling will be hit by EPA rules and 3M's exit from PFAS 'forever chemicals'." 2025. https://www.datacenterdynamics.com/en/news/two-phase-cooling-will-be-hit-by-epa-rules-and-3ms-exit-from-pfas-forever-chemicals/.

Data Center Frontier. "8 Trends That Will Shape the Data Center Industry In 2025." 2025. https://www.datacenterfrontier.com/cloud/article/55253151/8-trends-that-will-shape-the-data-center-industry-in-2025.

———. "Best Practices for Deploying Liquid Cooled Servers in Your Data Center." 2025. https://www.datacenterfrontier.com/sponsored/article/55138161/best-practices-for-deploying-liquid-cooled-servers-in-your-data-center.

———. "Google Developing New 'Climate Conscious' Cooling Tech to Save Water." 2025. https://www.datacenterfrontier.com/cooling/article/33001080/google-developing-new-climate-conscious-cooling-tech-to-save-water.

———. "Google Shifts to Liquid Cooling for AI Data Crunching." 2025. https://www.datacenterfrontier.com/cloud/article/11430207/google-shifts-to-liquid-cooling-for-ai-data-crunching.

———. "Meta Plans Shift to Liquid Cooling for its Data Center Infrastructure." 2025. https://www.datacenterfrontier.com/cooling/article/11436915/meta-plans-shift-to-liquid-cooling-for-its-data-center-infrastructure.

———. "Meta Previews New Data Center Design for an AI-Powered Future." 2025. https://www.datacenterfrontier.com/data-center-design/article/33005296/meta-previews-new-data-center-design-for-an-ai-powered-future.

———. "OCP 2024 Spotlight: Meta Debuts 140 kW Liquid-Cooled AI Rack; Google Eyes Robotics to Muscle Hyperscaler GPUs." 2024. https://www.datacenterfrontier.com/hyperscale/article/55238148/ocp-2024-spotlight-meta-shows-off-140-kw-liquid-cooled-ai-rack-google-eyes-robotics-to-muscle-hyperscaler-gpu-placement.

———. "Pushing the Boundaries of Air Cooling in High Density Environments." 2025. https://www.datacenterfrontier.com/special-reports/article/11427279/pushing-the-boundaries-of-air-cooling-in-high-density-environments.

———. "Report: Meta Plans Shift to Liquid Cooling in AI-Centric Data Center Redesign." 2025. https://www.datacenterfrontier.com/cooling/article/33004107/report-meta-plans-shift-to-liquid-cooling-in-ai-centric-data-center-redesign.

———. "The Importance of Liquid Cooling to the Open Compute Project (OCP)." 2025. https://www.datacenterfrontier.com/sponsored/article/55134348/the-importance-of-liquid-cooling-to-the-open-compute-project-ocp.

———. "Waste Heat Utilization is the Data Center Industry's Next Step Toward Net-Zero Energy." 2025. https://www.datacenterfrontier.com/voices-of-the-industry/article/11428787/waste-heat-utilization-is-the-data-center-industrys-next-step-toward-net-zero-energy.

———. "ZutaCore's HyperCool Liquid Cooling Technology to Support NVIDIA's Advanced H100 and H200 GPUs for Sustainable AI." 2024. https://www.datacenterfrontier.com/press-releases/press-release/33038994/zutacores-hypercool-liquid-cooling-technology-to-support-nvidias-advanced-h100-and-h200-gpus-for-sustainable-ai.

Data Center Knowledge. "Data Center Retrofit Strategies." 2025. https://www.datacenterknowledge.com/infrastructure/data-center-retrofit-strategies.

———. "Hybrid Cooling: The Bridge to Full Liquid Cooling in Data Centers." 2025. https://www.datacenterknowledge.com/cooling/hybrid-cooling-the-bridge-to-full-liquid-cooling-in-data-centers.

Data Centre Review. "Making the most of data centre waste heat." June 2024. https://datacentrereview.com/2024/06/making-the-most-of-data-centre-waste-heat/.

Datacenters. "CoreWeave's Role in Google and OpenAI's Cloud Partnership Redefines AI Infrastructure." 2025. https://www.datacenters.com/news/coreweave-s-strategic-role-in-google-and-openai-s-cloud-collaboration.

Dell. "When to Move from Air Cooling to Liquid Cooling for Your Data Center." 2025. https://www.dell.com/en-us/blog/when-to-move-from-air-cooling-to-liquid-cooling-for-your-data-center/.

Digital Infra Network. "Google's megawatt move for AI: Revamping power and cooling." 2025. https://digitalinfranetwork.com/news/google-ocp-400v-liquid-cooling/.

Enconnex. "Data Center Liquid Cooling vs. Air Cooling." 2025. https://blog.enconnex.com/data-center-liquid-cooling-vs-air-cooling.

Engineering at Meta. "Meta's open AI hardware vision." October 15, 2024. https://engineering.fb.com/2024/10/15/data-infrastructure/metas-open-ai-hardware-vision/.

Fortune Business Insights. "Two-Phase Data Center Liquid Immersion Cooling Market, 2032." 2025. https://www.fortunebusinessinsights.com/two-phase-data-center-liquid-immersion-cooling-market-113122.

Google Cloud. "Enabling 1 MW IT racks and liquid cooling at OCP EMEA Summit." Google Cloud Blog. 2025. https://cloud.google.com/blog/topics/systems/enabling-1-mw-it-racks-and-liquid-cooling-at-ocp-emea-summit.

GR Cooling. "Exploring Advanced Liquid Cooling: Immersion vs. Direct-to-Chip Cooling." 2025. https://www.grcooling.com/blog/exploring-advanced-liquid-cooling/.

———. "Two-Phase Versus Single-Phase Immersion Cooling." 2025. https://www.grcooling.com/blog/two-phase-versus-single-phase-immersion-cooling/.

HDR. "Direct-To-Chip Liquid Cooling." 2025. https://www.hdrinc.com/insights/direct-chip-liquid-cooling.

HiRef. "Hybrid Rooms: the combined solution for air and liquid cooling in data centers." 2025. https://hiref.com/news/hybrid-rooms-data-centers.

HPCwire. "H100 Fading: Nvidia Touts 2024 Hardware with H200." November 13, 2023. https://www.hpcwire.com/2023/11/13/h100-fading-nvidia-touts-2024-hardware-with-h200/.

IDTechEx. "Thermal Management for Data Centers 2025-2035: Technologies, Markets, and Opportunities." 2025. https://www.idtechex.com/en/research-report/thermal-management-for-data-centers/1036.

JetCool. "Direct Liquid Cooling vs. Immersion Cooling for Data Centers." 2025. https://jetcool.com/post/five-reasons-water-cooling-is-better-than-immersion-cooling/.

———. "Liquid Cooling System for NVIDIA H100 GPU." 2025. https://jetcool.com/h100/.

Maroonmonkeys. "CDU." 2025. https://www.maroonmonkeys.com/motivair/cdu.html.

Microsoft. "Project Natick Phase 2." 2025. https://natick.research.microsoft.com/.

Microsoft News. "To cool datacenter servers, Microsoft turns to boiling liquid." 2025. https://news.microsoft.com/source/features/innovation/datacenter-liquid-cooling/.

Nortek Data Center Cooling Solutions. "Waste Heat Utilization is the Data Center Industry's Next Step Toward Net-Zero Energy." 2025. https://www.nortekdatacenter.com/waste-heat-utilization-is-the-data-center-industrys-next-step-toward-net-zero-energy/.

NVIDIA. "H200 Tensor Core GPU." 2025. https://www.nvidia.com/en-us/data-center/h200/.

Open Compute Project. "Open Compute Project Foundation Expands Its Open Systems for AI Initiative." 2025. https://www.opencompute.org/blog/open-compute-project-foundation-expands-its-open-systems-for-ai-initiative.

P&S Intelligence. "Immersion Cooling Market Size, Share & Trends Analysis, 2032." 2025. https://www.psmarketresearch.com/market-analysis/immersion-cooling-market.

PR Newswire. "Supermicro Introduces Rack Scale Plug-and-Play Liquid-Cooled AI SuperClusters for NVIDIA Blackwell and NVIDIA HGX H100/H200." 2024. https://www.prnewswire.com/news-releases/supermicro-introduces-rack-scale-plug-and-play-liquid-cooled-ai-superclusters-for-nvidia-blackwell-and-nvidia-hgx-h100h200--radical-innovations-in-the-ai-era-to-make-liquid-cooling-free-with-a-bonus-302163611.html.

———. "ZutaCore's HyperCool Liquid Cooling Technology to Support NVIDIA's Advanced H100 and H200 GPUs for Sustainable AI." 2024. https://www.prnewswire.com/news-releases/zutacores-hypercool-liquid-cooling-technology-to-support-nvidias-advanced-h100-and-h200-gpus-for-sustainable-ai-302087410.html.

Rittal. "What is Direct to Chip Cooling – and Is Liquid Cooling in your Future?" 2025. https://www.rittal.com/us-en_US/Company/Rittal-Stories/What-is-Direct-to-Chip-Cooling-and-Is-Liquid-Cooling-in-your-Future.

ScienceDirect. "Liquid cooling of data centers: A necessity facing challenges." 2024. https://www.sciencedirect.com/science/article/abs/pii/S1359431124007804.

SemiAnalysis. "Datacenter Anatomy Part 1: Electrical Systems." October 14, 2024. https://semianalysis.com/2024/10/14/datacenter-anatomy-part-1-electrical/.

———. "Datacenter Anatomy Part 2 – Cooling Systems." February 13, 2025. https://semianalysis.com/2025/02/13/datacenter-anatomy-part-2-cooling-systems/.

———. "Multi-Datacenter Training: OpenAI's Ambitious Plan To Beat Google's Infrastructure." September 4, 2024. https://semianalysis.com/2024/09/04/multi-datacenter-training-openais/.

TechPowerUp. "NVIDIA H100 PCIe 80 GB Specs." TechPowerUp GPU Database. 2025. https://www.techpowerup.com/gpu-specs/h100-pcie-80-gb.c3899.

TechTarget. "Liquid Cooling vs. Air Cooling in the Data Center." 2025. https://www.techtarget.com/searchdatacenter/feature/Liquid-cooling-vs-air-cooling-in-the-data-center.

Unisys. "How leading LLM developers are fueling the liquid cooling boom." 2025. https://www.unisys.com/blog-post/dws/how-leading-llm-developers-are-fueling-the-liquid-cooling-boom/.

Upsite Technologies. "How Rack Density and Delta T Impact Your Airflow Management Strategy." 2025. https://www.upsite.com/blog/rack-density-delta-t-impact-airflow-management-strategy/.

———. "When to Retrofit the Data Center to Accommodate AI, and When Not to." 2025. https://www.upsite.com/blog/when-to-retrofit-the-data-center-to-accommodate-ai-and-when-not-to/.

Uptime Institute. "Data Center Cooling Best Practices." 2025. https://journal.uptimeinstitute.com/implementing-data-center-cooling-best-practices/.

———. "Performance expectations of liquid cooling need a reality check." Uptime Institute Blog. 2025. https://journal.uptimeinstitute.com/performance-expectations-of-liquid-cooling-need-a-reality-check/.

Utility Dive. "The 2025 outlook for data center cooling." 2025. https://www.utilitydive.com/news/2025-outlook-data-center-cooling-electricity-demand-ai-dual-phase-direct-to-chip-energy-efficiency/738120/.

Vertiv. "Deploying liquid cooling in data centers: Installing and managing coolant distribution units (CDUs)." 2025. https://www.vertiv.com/en-us/about/news-and-insights/articles/blog-posts/deploying-liquid-cooling-in-data-centers-installing-and-managing-coolant-distribution-units-cdus/.

———. "Liquid and Immersion Cooling Options for Data Centers." 2025. https://www.vertiv.com/en-us/solutions/learn-about/liquid-cooling-options-for-data-centers/.

———. "Quantifying the Impact on PUE and Energy Consumption When Introducing Liquid Cooling Into an Air-cooled Data Center." 2025. https://www.vertiv.com/en-emea/about/news-and-insights/articles/blog-posts/quantifying-data-center-pue-when-introducing-liquid-cooling/.

———. "Understanding direct-to-chip cooling in HPC infrastructure: A deep dive into liquid cooling." 2025. https://www.vertiv.com/en-emea/about/news-and-insights/articles/educational-articles/understanding-direct-to-chip-cooling-in-hpc-infrastructure-a-deep-dive-into-liquid-cooling/.

———. "Vertiv™ CoolPhase CDU | High Density Solutions." 2025. https://www.vertiv.com/en-us/products-catalog/thermal-management/high-density-solutions/vertiv-coolphase-cdu/.

WGI. "Cooling Down AI and Data Centers." 2025. https://wginc.com/cooling-down-ai-and-data-centers/.
