Liquid Cooling vs. Air Cooling: A 50kW GPU Rack Guide (2025)

GPU racks are hitting the 50kW thermal ceiling. Liquid cooling delivers up to 21% energy savings and roughly 40% lower cooling costs. An essential guide for AI infrastructure teams facing this bottleneck.

The exponential growth of AI workloads has pushed data center cooling to a critical inflection point. With GPU rack densities surging past 50kW, and next-generation systems demanding 100kW or more, traditional air cooling has reached its fundamental physical limits. This analysis examines how the industry is navigating the thermal transition with advanced liquid cooling technologies that deliver 10-21% energy savings and roughly 40% lower cooling costs, providing the infrastructure foundation the AI revolution requires.

When air becomes the bottleneck

Air cooling does not degrade gradually at high density; it fails abruptly. At 50kW per rack, the physics is unforgiving: removing the heat requires roughly 7,850 cubic feet per minute (CFM) of airflow at a 20°F temperature rise. Doubling the power to 100kW demands 15,700 CFM, driving hurricane-force velocities through server inlets of just 2-4 square inches. The basic heat-removal equation (Q [W] ≈ 0.318 × CFM × ΔT [°F]) exposes an intractable problem: required airflow grows linearly with density, but fan power scales with the cube of fan speed. A 10% increase in airflow costs roughly 33% more fan power, an energy spiral that makes high-density air cooling impractical both economically and physically.
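
As a sanity check on these numbers, here is a minimal Python sketch using the article's rule of thumb and the standard fan affinity law; the 20°F temperature rise is the value assumed above, and small differences from the quoted figures are rounding.

```python
# Back-of-envelope airflow and fan-power estimate for a dense GPU rack.
# Uses the rule of thumb Q[W] ≈ 0.318 * CFM * dT[°F] and the fan affinity
# law (power ~ speed^3, flow ~ speed). Values are illustrative only.

def required_cfm(heat_watts: float, delta_t_f: float = 20.0) -> float:
    """Airflow (CFM) needed to remove heat_watts at a delta_t_f air temperature rise."""
    return heat_watts / (0.318 * delta_t_f)

def fan_power_multiplier(flow_increase: float) -> float:
    """Relative fan power after scaling airflow by (1 + flow_increase)."""
    return (1.0 + flow_increase) ** 3

for rack_kw in (50, 100):
    print(f"{rack_kw} kW rack -> {required_cfm(rack_kw * 1000):,.0f} CFM at a 20°F rise")

# A 10% airflow increase costs roughly 33% more fan power.
print(f"+10% airflow -> {fan_power_multiplier(0.10) - 1:.0%} more fan power")
```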

Real-world evidence confirms these theoretical limits. In one documented incident, a room of 250 racks running at only 6kW each went from 72°F to over 90°F within 75 seconds of a cooling failure. Legacy data centers designed for 5-10kW average rack densities simply cannot handle modern GPU workloads. Even with well-executed hot/cold aisle containment, air cooling struggles beyond roughly 40kW per rack, and uncontained systems lose 20-40% of their capacity to hot-air recirculation. ASHRAE's new H1 environmental class, created specifically for high-density equipment, restricts recommended temperatures to the 18-22°C range, a range air cooling cannot sustain at GPU scale.
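
A rough illustration of why such failures unfold so quickly: treating the hall's air as a single thermal mass, the time to overheat follows directly from the heat load. The room volume below is an assumption chosen to be plausible for 250 racks, not a figure from the incident report.

```python
# Rough estimate of how quickly a data hall heats up after a cooling failure.
# Treats the room air as one lumped thermal mass; the hall volume is assumed.

AIR_DENSITY = 1.2    # kg/m^3
AIR_CP = 1005.0      # J/(kg*K)

def seconds_to_limit(heat_watts: float, room_volume_m3: float, temp_rise_c: float) -> float:
    """Seconds for room air to rise temp_rise_c under heat_watts with no cooling."""
    thermal_mass = room_volume_m3 * AIR_DENSITY * AIR_CP   # J/K
    return thermal_mass * temp_rise_c / heat_watts

heat = 250 * 6_000              # 250 racks at 6 kW each
rise = (90 - 72) * 5 / 9        # 18 °F ≈ 10 °C
volume = 9_000                  # m^3, assumed hall volume (~3,000 m^2 at 3 m clear height)
print(f"~{seconds_to_limit(heat, volume, rise):.0f} s to exceed the limit")
```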

Liquid cooling changes what is possible

The move to liquid cooling is not an incremental improvement; it is a fundamental rethinking of how heat is removed. Water can carry roughly 3,500 times more heat per unit volume than air, turning 100kW+ rack cooling from extraordinary into routine.
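
A quick check of that figure from approximate textbook fluid properties:

```python
# Volumetric heat capacity of water vs. air at roomish conditions,
# using approximate textbook property values.

WATER = {"density": 998.0, "cp": 4186.0}   # kg/m^3, J/(kg*K)
AIR   = {"density": 1.2,   "cp": 1005.0}

water_vol_cp = WATER["density"] * WATER["cp"]   # J/(m^3*K)
air_vol_cp = AIR["density"] * AIR["cp"]

print(f"Water stores ~{water_vol_cp / air_vol_cp:,.0f}x more heat per unit volume than air")
```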

Direct-to-chip cooling leads this shift, with cold plates containing microchannels (27-100 microns) mounted directly on the processors. Systems run on 40°C supply water and 50°C return water, removing 70-75% of rack heat through liquid while holding partial PUE to 1.02-1.03. Modern implementations support more than 1.5kW per chip, with a 9kW server requiring roughly 13 liters per minute of flow. The remaining 25-30% of heat, from memory, drives, and auxiliary components, still needs air cooling, which makes hybrid systems the practical choice for most deployments.
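
The flow figure follows from the basic relation Q = ṁ·cp·ΔT. A minimal sketch, assuming plain water properties and the 10°C supply-to-return rise described above:

```python
# Coolant flow needed to carry a given heat load: Q = m_dot * cp * dT.
# Checks the ~13 L/min figure for a 9 kW server on a 40°C supply / 50°C return loop.

WATER_DENSITY = 0.998   # kg/L
WATER_CP = 4186.0       # J/(kg*K)

def flow_lpm(heat_watts: float, delta_t_c: float) -> float:
    """Liters per minute of water needed to carry heat_watts at a delta_t_c rise."""
    kg_per_s = heat_watts / (WATER_CP * delta_t_c)
    return kg_per_s / WATER_DENSITY * 60.0

print(f"9 kW at a 10°C rise -> {flow_lpm(9_000, 10.0):.1f} L/min")
```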

Immersion cooling pushes the boundary further by submerging entire servers in dielectric fluid. Single-phase systems using mineral oil cost $50-100 per gallon and can sustain up to 200kW per rack. Two-phase systems promise superior heat transfer through boiling and condensation but face headwinds: fluorocarbon fluids cost $500-1,000 per gallon, and 3M's decision to exit production by 2025 over environmental concerns has frozen adoption. The technology's complexity, including sealed enclosures, cavitation risk, and PFAS regulation, limits it to specialized deployments.

Coolant distribution units (CDUs) form the backbone of liquid cooling infrastructure. Modern units range from 7kW rack-mounted systems to 2,000kW+ giants such as CoolIT's CHx2000. Leading vendors, including Vertiv, Schneider Electric, Motivair, and CoolIT, offer solutions with N+1 redundancy, 50-micron filtration, and variable-frequency drives that match pump speed to load. The CDU market, worth $1 billion in 2024 and projected to reach $3.6 billion by 2031 (a 20.5% CAGR), reflects how quickly liquid cooling is being adopted.
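
To illustrate what N+1 redundancy means in practice, here is a hypothetical sizing sketch; the rack count, rack power, and CDU capacity are illustrative assumptions, not vendor sizing guidance.

```python
# Illustrative N+1 CDU count for a row of liquid-cooled racks.
# All capacities below are assumptions for the example.

import math

def cdus_needed(total_load_kw: float, cdu_capacity_kw: float, redundancy: int = 1) -> int:
    """Smallest CDU count that carries the load, plus the redundancy spare(s)."""
    return math.ceil(total_load_kw / cdu_capacity_kw) + redundancy

row_load = 12 * 100          # 12 racks at 100 kW each
print(f"{cdus_needed(row_load, 700)} x 700 kW CDUs for a {row_load} kW row (N+1)")
```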

The art and economics of retrofitting

Transitioning an existing data center to liquid cooling takes careful orchestration. The most successful approach is a phased migration: start with one or two high-density racks, expand to a row, then scale with demand. Three main retrofit paths have emerged: liquid-to-air CDUs that reuse existing air handling, rear-door heat exchangers that can remove up to 40kW per rack, and direct-to-chip solutions that deliver the highest efficiency.
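
A toy decision helper based on the per-rack thresholds cited in this article; real retrofit choices also hinge on power, plumbing, and structural constraints.

```python
# Illustrative retrofit-path selector using the per-rack figures quoted above.
# The thresholds are from this article, not an engineering standard.

def retrofit_path(rack_kw: float) -> str:
    if rack_kw <= 40:
        return "rear-door heat exchanger or liquid-to-air CDU"
    if rack_kw <= 100:
        return "direct-to-chip cold plates with a liquid-to-liquid CDU"
    return "direct-to-chip plus supplemental cooling (or immersion)"

for kw in (20, 60, 140):
    print(f"{kw:>3} kW/rack -> {retrofit_path(kw)}")
```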

Infrastructure modification is the main challenge. Power is often the limiting factor: a facility designed for 5-10kW average loads cannot support 50kW+ racks regardless of cooling capability. Piping requires careful CFD modeling in raised-floor environments, or overhead routing with drip trays on slab construction. Floor loading, particularly for immersion systems, can exceed the structural capacity of older facilities.

Cost analysis shows compelling economics despite the higher upfront investment. A California Energy Commission study documented a complete liquid cooling system covering 1,200 servers across 17 racks at a total cost of $470,557, or $392 per server including facility modifications. Annual savings of 355 MWh (worth $39,155 at $0.11/kWh) put the simple payback at roughly 12 years, though optimized implementations achieve returns in 2-5 years. Schneider Electric's analysis shows a 14% capital saving from 4x rack compression, while operational savings include a 10.2% reduction in total data center power and a 15.5% improvement in Total Usage Effectiveness.
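
The payback arithmetic for the CEC case, reproduced as a short script; the savings figure differs from the article's $39,155 only by rounding of the annual energy total.

```python
# Simple-payback check for the California Energy Commission case cited above.
# Figures come from the article; no discounting or maintenance costs included.

capex_usd = 470_557
servers = 1_200
annual_savings_mwh = 355
price_per_kwh = 0.11

annual_savings_usd = annual_savings_mwh * 1_000 * price_per_kwh
print(f"Cost per server: ${capex_usd / servers:,.0f}")
print(f"Annual savings: ${annual_savings_usd:,.0f}")
print(f"Simple payback: {capex_usd / annual_savings_usd:.1f} years")
```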

In hybrid environments, integration challenges multiply. Even a "fully liquid-cooled" facility needs 20-30% air-cooling capacity for auxiliary components. Control systems must coordinate multiple cooling technologies, monitoring both rack inlet temperatures and supply water conditions. Redundancy becomes critical: rear-door heat exchangers must fall back to air cooling when opened for maintenance, and direct-to-chip systems have less than 10 seconds of thermal buffer at full load.
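
A hedged estimate of that thermal buffer: the only ride-through is the heat capacity of the coolant in the loop. The loop volume and allowable temperature rise below are assumptions for illustration.

```python
# Why the buffer is so short: a stalled coolant loop warms up on its own heat capacity.
# Loop volume, allowable rise, and rack load are assumed values.

WATER_CP = 4186.0     # J/(kg*K), ~1 kg per liter of water

def buffer_seconds(loop_liters: float, allowable_rise_c: float, load_watts: float) -> float:
    """Seconds until a stalled loop warms by allowable_rise_c while absorbing load_watts."""
    return loop_liters * WATER_CP * allowable_rise_c / load_watts

print(f"{buffer_seconds(loop_liters=20, allowable_rise_c=10, load_watts=140_000):.1f} s of ride-through")
```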

From pilot to production

Real-world deployments demonstrate the maturity of liquid cooling. Meta leads in scale, rolling out air-assisted liquid cooling across more than 40 million square feet of data center space. Its Catalina rack design supports 140kW and 72 GPUs, with facility-wide liquid cooling deployments targeted for completion in early 2025. The transition required scrapping several data centers already under construction in favor of an AI-optimized redesign, with the new architecture expected to deliver 31% cost savings.

Google's seven-year journey with liquid-cooled TPUs provides the industry's most comprehensive dataset. It has deployed closed-loop systems across more than 2,000 TPU pods at gigawatt scale, achieving 99.999% uptime while demonstrating roughly 30 times the thermal conductivity of air. Its fifth-generation CDU design, Project Deschutes, is being contributed to the Open Compute Project to accelerate industry-wide adoption.

Microsoft has pushed boundaries with two-phase immersion cooling in production, using a dielectric fluid that boils at 122°F (50°C), 50°C below the boiling point of water. The technique cuts server power consumption by 5-15% while eliminating cooling fans. Microsoft's pledge to reduce water use by 95% by 2024 is driving innovation in closed-loop, zero-evaporation systems.

Specialized providers such as CoreWeave showcase liquid cooling for AI workloads. With plans to deploy 4,000 GPUs by the end of 2024, the company is achieving 130kW rack densities and system utilization 20% higher than competitors. Its rail-optimized design saved 3.1 million GPU-hours through improved reliability, and it has stood up H100 clusters within 60 days.

Meeting the thermal demands of AI accelerators

GPU specifications make clear why liquid cooling has become mandatory. The NVIDIA H100 SXM5 has a 700W TDP and needs liquid cooling to reach peak performance. The H200 keeps the same power envelope while delivering 141GB of HBM3e memory and 4.8TB/s of bandwidth, 1.4 times its predecessor, with heat to match. The upcoming B200 pushes further: 1,200W for the liquid-cooled version and 1,000W for the air-cooled version, with 20 PFLOPS of FP4 performance that demands precise thermal management.
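
A rough per-server heat budget, under the assumption of an HGX-class node with eight 700W GPUs; the non-GPU allowance is an assumed figure, not a published specification, but the total lands near the 9kW server discussed earlier.

```python
# Rough heat budget for an assumed 8-GPU server built on 700 W H100-class parts.
# The non-GPU allowance (CPUs, DRAM, NICs, power conversion) is an assumption.

GPU_TDP_W = 700
GPUS_PER_SERVER = 8
NON_GPU_ALLOWANCE_W = 3_000

gpu_heat = GPU_TDP_W * GPUS_PER_SERVER
total = gpu_heat + NON_GPU_ALLOWANCE_W
print(f"GPU heat: {gpu_heat / 1000:.1f} kW, server total: ~{total / 1000:.1f} kW")
print(f"Rack of 4 such servers: ~{4 * total / 1000:.0f} kW before networking")
```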

The GB200 NVL72, which packs 72 Blackwell GPUs and 36 Grace CPUs into a single rack, marks the limit of air-cooling viability. At 140kW of rack power, it mandates liquid cooling through newly developed cold plates and 250kW CDUs. System-level considerations compound the complexity: each NVSwitch interconnect adds another 10-15W, while high-speed memory and the power delivery system contribute substantial additional heat.

JetCool's technical analysis shows a marked performance difference: its H100 SmartPlate achieves 0.021°C/W thermal resistance, running chips 35°C cooler than air-cooled alternatives while accepting 60°C inlet water. That temperature reduction could in theory extend GPU lifespan by up to 8x while sustaining maximum performance, which matters for AI training runs that last for weeks.
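
What that thermal resistance implies, using the simplified single-resistance model T_chip ≈ T_inlet + R_th × P:

```python
# Chip temperature from a single lumped thermal resistance: T_chip = T_inlet + R_th * P.
# A simplification for illustration; real packages have several resistances in series.

def chip_temp_c(inlet_c: float, r_th_c_per_w: float, power_w: float) -> float:
    return inlet_c + r_th_c_per_w * power_w

print(f"H100 at 700 W with 60°C water: ~{chip_temp_c(60, 0.021, 700):.0f}°C at the cold plate")
print(f"Same chip with 30°C water: ~{chip_temp_c(30, 0.021, 700):.0f}°C")
```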

The roadmap to 2030

The industry stands at a transition point where best practices are rapidly becoming requirements. ASHRAE's new H1 environmental class (recommending 18-22°C) acknowledges that traditional guidelines cannot serve AI workloads. The Open Compute Project's liquid cooling standards drive interoperability, and its Immersion Requirements Rev. 2.10 establishes a qualification process for the emerging technology.

Two-phase immersion cooling, despite its current challenges, shows promise for mainstream adoption in 2025-2027. Market forecasts project growth from $375 million in 2024 to $1.2 billion by 2032, driven by heat transfer performance that can support more than 1,500W per chip. Innovations such as Accelsius NeuCool, along with replacements for 3M's discontinued fluids, address the environmental concerns while preserving performance.

AI-driven optimization delivers immediate returns. Google DeepMind's implementation achieved a 40% reduction in cooling energy through real-time learning, and platforms such as Siemens' White Space Cooling Optimization are bringing similar capabilities to the broader market. These systems can predict failures, optimize coolant chemistry, and adapt dynamically to workload patterns; 91% of vendors expect such capabilities to become ubiquitous within five years.

Waste heat recovery turns a liability into an asset. Stockholm Data Parks already heats 10,000 homes with data center waste heat and aims to meet 10% of the city's heating demand by 2035. Regulatory pressure is accelerating adoption: Germany will require 20% heat reuse by 2028, and California's Title 24 mandates recovery-ready infrastructure in new construction. Heat pumps lift 30-40°C waste heat to the 70-80°C needed for district heating, creating a revenue stream from previously discarded energy.
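
An idealized check on that heat-pump lift, using the Carnot limit as an upper bound; the real-world fraction of Carnot is an assumption, not a measured figure.

```python
# Carnot bound on heating COP for lifting waste heat to district-heating temperatures.
# The 45%-of-Carnot factor is an assumed efficiency for real machines.

def carnot_cop_heating(source_c: float, sink_c: float) -> float:
    """Ideal heating COP for a lift from source_c to sink_c."""
    source_k, sink_k = source_c + 273.15, sink_c + 273.15
    return sink_k / (sink_k - source_k)

ideal = carnot_cop_heating(35, 75)    # mid-range temperatures from the paragraph above
realistic = 0.45 * ideal
print(f"Ideal COP: {ideal:.1f}, plausible real-world COP: ~{realistic:.1f}")
```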

Making the transition

Deploying liquid cooling successfully requires strategic planning on several fronts. Organizations should start with simple liquid-to-air CDUs to lower the barrier to entry, but must evaluate power infrastructure first: inadequate power capacity sinks a retrofit regardless of cooling technology. Piloting with one or two racks builds experience before scaling, while retaining air-cooling expertise remains essential for hybrid operations.

Financial modeling must consider whole-system value. Initial investment runs $1,000 to $2,000 per kW of cooling capacity, but the operational savings compound: optimized implementations cut facility power by 27% and cooling energy by 30% versus conventional systems, and, more importantly, enable revenue-generating AI workloads that air cooling cannot support at all. Leading implementations achieve payback in under two years through careful design: bypassing inefficient chiller integration saves 20-30%, and concentrating on the highest-density applications maximizes returns.
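
A hedged, energy-only payback sketch built from these figures. The IT load, baseline PUE, and electricity prices are assumptions; the sub-two-year paybacks described above also rely on avoided capital cost and new revenue, which this sketch deliberately omits.

```python
# Energy-only simple payback for a liquid-cooling retrofit. The 27% facility-power
# saving and $/kW capex range come from the article; everything else is assumed.

IT_LOAD_KW = 1_000
BASELINE_PUE = 1.5                 # assumed air-cooled baseline
FACILITY_POWER_SAVING = 0.27       # from the article
CAPEX_PER_KW = 1_500               # midpoint of the $1,000-$2,000/kW range
HOURS_PER_YEAR = 8_760

def payback_years(price_per_kwh: float) -> float:
    capex = IT_LOAD_KW * CAPEX_PER_KW
    baseline_facility_kw = IT_LOAD_KW * BASELINE_PUE
    saved_kwh = baseline_facility_kw * FACILITY_POWER_SAVING * HOURS_PER_YEAR
    return capex / (saved_kwh * price_per_kwh)

for price in (0.11, 0.20):
    print(f"${price:.2f}/kWh -> simple payback ~{payback_years(price):.1f} years")
```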

Technical teams need new capabilities. Beyond traditional HVAC knowledge, staff must understand coolant chemistry, leak-response protocols, and integrated control systems. Vendor partnerships become essential: 24/7 support for specialized components and preventive maintenance at six-month intervals are operational necessities. Safety protocols expand to cover dielectric fluid handling and pressurized-system management.

Market signals show overwhelming momentum. The data center liquid cooling market is projected to grow from $4.9 billion in 2024 to $21.3 billion by 2030, a 27.6% CAGR. Single-phase direct-to-chip cooling will be standard for AI workloads by 2025-2026, with two-phase immersion reaching mainstream adoption around 2027. By 2030, 1MW racks will require advanced liquid cooling as the default rather than the exception.

Conclusion

The physics is clear: air cooling has reached its limit. At 50-100kW rack densities, fundamental thermodynamic constraints make liquid cooling not merely preferable but mandatory. The shift represents the most significant infrastructure change in data center history, demanding new skills, substantial investment, and operational transformation. Yet the benefits, including 10-21% energy savings, roughly 40% lower cooling costs, up to 8x gains in reliability, and above all the ability to deploy next-generation AI infrastructure, make the evolution inevitable. Organizations that master liquid cooling today will power tomorrow's AI breakthroughs, while those that delay will fall behind as the industry races toward ever-higher compute densities. We have hit the thermal wall; liquid cooling is how we break through it.

References

ACM Digital Library. "Energy-efficient LLM Training in GPU datacenters with Immersion Cooling Systems." Proceedings of the 16th ACM International Conference on Future and Sustainable Energy Systems. 2025. https://dl.acm.org/doi/10.1145/3679240.3734609.

AMAX. "Comparing NVIDIA Blackwell Configurations." 2025. https://www.amax.com/comparing-nvidia-blackwell-configurations/.

———. "Top 5 Considerations for Deploying NVIDIA Blackwell." 2025. https://www.amax.com/top-5-considerations-for-deploying-nvidia-blackwell/.

arXiv. "[1309.4887] iDataCool: HPC with Hot-Water Cooling and Energy Reuse." 2013. https://ar5iv.labs.arxiv.org/html/1309.4887.

———. "[1709.05077] Transforming Cooling Optimization for Green Data Center via Deep Reinforcement Learning." 2017. https://ar5iv.labs.arxiv.org/html/1709.05077.

Attom. "Ashrae's New Thermal Guideline Update: A New High Density Trend." Expert Green Prefab Data Centers. 2025. https://attom.tech/ashraes-new-thermal-guideline-update-a-new-high-density-trend/.

Chilldyne. "High-power liquid cooling design: direct-to-chip solution requirements for 500 kW Racks." Chilldyne | Liquid Cooling. July 29, 2024. https://chilldyne.com/2024/07/29/high-power-liquid-cooling-design-direct-to-chip-solution-requirements-for-500-kw-racks/.

Compass Datacenters. "What Is Data Center Cooling?" 2025. https://www.compassdatacenters.com/data-center-cooling/.

Converge Digest. "Meta Outlines AI Infrastructure Upgrades at OCP Summit 2024." 2024. https://convergedigest.com/meta-outlinesai-infrastructure-upgrades-at-ocp-summit-2024/.

Core Winner LTD. "Comprehensive Guide to Liquid Cooling: The Future of High-Performance Data Centers and AI Deployments." 2025. https://www.corewinner.com/en/blog/detail/52.

CoreWeave. "Building AI Clusters for Enterprises 2025." 2025. https://www.coreweave.com/blog/building-ai-clusters-for-enterprises-2025.

———. "GPUs for AI Models and Innovation." 2025. https://www.coreweave.com/products/gpu-compute.

Cyber Defense Advisors. "AI-Driven Predictive Maintenance: The Future of Data Center Reliability." 2025. https://cyberdefenseadvisors.com/ai-driven-predictive-maintenance-the-future-of-data-center-reliability/.

Data Center Catalog. "Meta Plans Shift to Liquid Cooling for its Data Center Infrastructure." 2022. https://datacentercatalog.com/news/2022/meta-plans-shift-to-liquid-cooling-for-its-data-center-infrastructure.

Data Center Dynamics. "An introduction to liquid cooling in the data center." 2025. https://www.datacenterdynamics.com/en/analysis/an-introduction-to-liquid-cooling-in-the-data-center/.

———. "Hyperscalers prepare for 1MW racks at OCP EMEA; Google announces new CDU." 2025. https://www.datacenterdynamics.com/en/news/hyperscalers-prepare-for-1mw-racks-at-ocp-emea-google-announces-new-cdu/.

———. "New ASHRAE guidelines challenge efficiency drive." 2025. https://www.datacenterdynamics.com/en/opinions/new-ashrae-guidelines-challenge-efficiency-drive/.

———. "Nvidia's CEO confirms upcoming system will be liquid cooled." 2025. https://www.datacenterdynamics.com/en/news/nvidias-ceo-confirms-next-dgx-will-be-liquid-cooled/.

———. "Optimizing data center efficiency with direct-to-chip liquid cooling." 2025. https://www.datacenterdynamics.com/en/opinions/optimizing-data-center-efficiency-with-direct-to-chip-liquid-cooling/.

———. "Two-phase cooling will be hit by EPA rules and 3M's exit from PFAS 'forever chemicals'." 2025. https://www.datacenterdynamics.com/en/news/two-phase-cooling-will-be-hit-by-epa-rules-and-3ms-exit-from-pfas-forever-chemicals/.

Data Center Frontier. "8 Trends That Will Shape the Data Center Industry In 2025." 2025. https://www.datacenterfrontier.com/cloud/article/55253151/8-trends-that-will-shape-the-data-center-industry-in-2025.

———. "Best Practices for Deploying Liquid Cooled Servers in Your Data Center." 2025. https://www.datacenterfrontier.com/sponsored/article/55138161/best-practices-for-deploying-liquid-cooled-servers-in-your-data-center.

———. "Google Developing New 'Climate Conscious' Cooling Tech to Save Water." 2025. https://www.datacenterfrontier.com/cooling/article/33001080/google-developing-new-climate-conscious-cooling-tech-to-save-water.

———. "Google Shifts to Liquid Cooling for AI Data Crunching." 2025. https://www.datacenterfrontier.com/cloud/article/11430207/google-shifts-to-liquid-cooling-for-ai-data-crunching.

———. "Meta Plans Shift to Liquid Cooling for its Data Center Infrastructure." 2025. https://www.datacenterfrontier.com/cooling/article/11436915/meta-plans-shift-to-liquid-cooling-for-its-data-center-infrastructure.

———. "Meta Previews New Data Center Design for an AI-Powered Future." 2025. https://www.datacenterfrontier.com/data-center-design/article/33005296/meta-previews-new-data-center-design-for-an-ai-powered-future.

———. "OCP 2024 Spotlight: Meta Debuts 140 kW Liquid-Cooled AI Rack; Google Eyes Robotics to Muscle Hyperscaler GPUs." 2024. https://www.datacenterfrontier.com/hyperscale/article/55238148/ocp-2024-spotlight-meta-shows-off-140-kw-liquid-cooled-ai-rack-google-eyes-robotics-to-muscle-hyperscaler-gpu-placement.

———. "Pushing the Boundaries of Air Cooling in High Density Environments." 2025. https://www.datacenterfrontier.com/special-reports/article/11427279/pushing-the-boundaries-of-air-cooling-in-high-density-environments.

———. "Report: Meta Plans Shift to Liquid Cooling in AI-Centric Data Center Redesign." 2025. https://www.datacenterfrontier.com/cooling/article/33004107/report-meta-plans-shift-to-liquid-cooling-in-ai-centric-data-center-redesign.

———. "The Importance of Liquid Cooling to the Open Compute Project (OCP)." 2025. https://www.datacenterfrontier.com/sponsored/article/55134348/the-importance-of-liquid-cooling-to-the-open-compute-project-ocp.

———. "Waste Heat Utilization is the Data Center Industry's Next Step Toward Net-Zero Energy." 2025. https://www.datacenterfrontier.com/voices-of-the-industry/article/11428787/waste-heat-utilization-is-the-data-center-industrys-next-step-toward-net-zero-energy.

———. "ZutaCore's HyperCool Liquid Cooling Technology to Support NVIDIA's Advanced H100 and H200 GPUs for Sustainable AI." 2024. https://www.datacenterfrontier.com/press-releases/press-release/33038994/zutacores-hypercool-liquid-cooling-technology-to-support-nvidias-advanced-h100-and-h200-gpus-for-sustainable-ai.

Data Center Knowledge. "Data Center Retrofit Strategies." 2025. https://www.datacenterknowledge.com/infrastructure/data-center-retrofit-strategies.

———. "Hybrid Cooling: The Bridge to Full Liquid Cooling in Data Centers." 2025. https://www.datacenterknowledge.com/cooling/hybrid-cooling-the-bridge-to-full-liquid-cooling-in-data-centers.

Data Centre Review. "Making the most of data centre waste heat." June 2024. https://datacentrereview.com/2024/06/making-the-most-of-data-centre-waste-heat/.

Datacenters. "CoreWeave's Role in Google and OpenAI's Cloud Partnership Redefines AI Infrastructure." 2025. https://www.datacenters.com/news/coreweave-s-strategic-role-in-google-and-openai-s-cloud-collaboration.

Dell. "When to Move from Air Cooling to Liquid Cooling for Your Data Center." 2025. https://www.dell.com/en-us/blog/when-to-move-from-air-cooling-to-liquid-cooling-for-your-data-center/.

Digital Infra Network. "Google's megawatt move for AI: Revamping power and cooling." 2025. https://digitalinfranetwork.com/news/google-ocp-400v-liquid-cooling/.

Enconnex. "Data Center Liquid Cooling vs. Air Cooling." 2025. https://blog.enconnex.com/data-center-liquid-cooling-vs-air-cooling.

Engineering at Meta. "Meta's open AI hardware vision." October 15, 2024. https://engineering.fb.com/2024/10/15/data-infrastructure/metas-open-ai-hardware-vision/.

Fortune Business Insights. "Two-Phase Data Center Liquid Immersion Cooling Market, 2032." 2025. https://www.fortunebusinessinsights.com/two-phase-data-center-liquid-immersion-cooling-market-113122.

Google Cloud. "Enabling 1 MW IT racks and liquid cooling at OCP EMEA Summit." Google Cloud Blog. 2025. https://cloud.google.com/blog/topics/systems/enabling-1-mw-it-racks-and-liquid-cooling-at-ocp-emea-summit.

GR Cooling. "Exploring Advanced Liquid Cooling: Immersion vs. Direct-to-Chip Cooling." 2025. https://www.grcooling.com/blog/exploring-advanced-liquid-cooling/.

———. "Two-Phase Versus Single-Phase Immersion Cooling." 2025. https://www.grcooling.com/blog/two-phase-versus-single-phase-immersion-cooling/.

HDR. "Direct-To-Chip Liquid Cooling." 2025. https://www.hdrinc.com/insights/direct-chip-liquid-cooling.

HiRef. "Hybrid Rooms: the combined solution for air and liquid cooling in data centers." 2025. https://hiref.com/news/hybrid-rooms-data-centers.

HPCwire. "H100 Fading: Nvidia Touts 2024 Hardware with H200." November 13, 2023. https://www.hpcwire.com/2023/11/13/h100-fading-nvidia-touts-2024-hardware-with-h200/.

IDTechEx. "Thermal Management for Data Centers 2025-2035: Technologies, Markets, and Opportunities." 2025. https://www.idtechex.com/en/research-report/thermal-management-for-data-centers/1036.

JetCool. "Direct Liquid Cooling vs. Immersion Cooling for Data Centers." 2025. https://jetcool.com/post/five-reasons-water-cooling-is-better-than-immersion-cooling/.

———. "Liquid Cooling System for NVIDIA H100 GPU." 2025. https://jetcool.com/h100/.

Maroonmonkeys. "CDU." 2025. https://www.maroonmonkeys.com/motivair/cdu.html.

Microsoft. "Project Natick Phase 2." 2025. https://natick.research.microsoft.com/.

Microsoft News. "To cool datacenter servers, Microsoft turns to boiling liquid." 2025. https://news.microsoft.com/source/features/innovation/datacenter-liquid-cooling/.

Nortek Data Center Cooling Solutions. "Waste Heat Utilization is the Data Center Industry's Next Step Toward Net-Zero Energy." 2025. https://www.nortekdatacenter.com/waste-heat-utilization-is-the-data-center-industrys-next-step-toward-net-zero-energy/.

NVIDIA. "H200 Tensor Core GPU." 2025. https://www.nvidia.com/en-us/data-center/h200/.

Open Compute Project. "Open Compute Project Foundation Expands Its Open Systems for AI Initiative." 2025. https://www.opencompute.org/blog/open-compute-project-foundation-expands-its-open-systems-for-ai-initiative.

P&S Intelligence. "Immersion Cooling Market Size, Share & Trends Analysis, 2032." 2025. https://www.psmarketresearch.com/market-analysis/immersion-cooling-market.

PR Newswire. "Supermicro Introduces Rack Scale Plug-and-Play Liquid-Cooled AI SuperClusters for NVIDIA Blackwell and NVIDIA HGX H100/H200." 2024. https://www.prnewswire.com/news-releases/supermicro-introduces-rack-scale-plug-and-play-liquid-cooled-ai-superclusters-for-nvidia-blackwell-and-nvidia-hgx-h100h200--radical-innovations-in-the-ai-era-to-make-liquid-cooling-free-with-a-bonus-302163611.html.

———. "ZutaCore's HyperCool Liquid Cooling Technology to Support NVIDIA's Advanced H100 and H200 GPUs for Sustainable AI." 2024. https://www.prnewswire.com/news-releases/zutacores-hypercool-liquid-cooling-technology-to-support-nvidias-advanced-h100-and-h200-gpus-for-sustainable-ai-302087410.html.

Rittal. "What is Direct to Chip Cooling – and Is Liquid Cooling in your Future?" 2025. https://www.rittal.com/us-en_US/Company/Rittal-Stories/What-is-Direct-to-Chip-Cooling-and-Is-Liquid-Cooling-in-your-Future.

ScienceDirect. "Liquid cooling of data centers: A necessity facing challenges." 2024. https://www.sciencedirect.com/science/article/abs/pii/S1359431124007804.

SemiAnalysis. "Datacenter Anatomy Part 1: Electrical Systems." October 14, 2024. https://semianalysis.com/2024/10/14/datacenter-anatomy-part-1-electrical/.

———. "Datacenter Anatomy Part 2 – Cooling Systems." February 13, 2025. https://semianalysis.com/2025/02/13/datacenter-anatomy-part-2-cooling-systems/.

———. "Multi-Datacenter Training: OpenAI's Ambitious Plan To Beat Google's Infrastructure." September 4, 2024. https://semianalysis.com/2024/09/04/multi-datacenter-training-openais/.

TechPowerUp. "NVIDIA H100 PCIe 80 GB Specs." TechPowerUp GPU Database. 2025. https://www.techpowerup.com/gpu-specs/h100-pcie-80-gb.c3899.

TechTarget. "Liquid Cooling vs. Air Cooling in the Data Center." 2025. https://www.techtarget.com/searchdatacenter/feature/Liquid-cooling-vs-air-cooling-in-the-data-center.

Unisys. "How leading LLM developers are fueling the liquid cooling boom." 2025. https://www.unisys.com/blog-post/dws/how-leading-llm-developers-are-fueling-the-liquid-cooling-boom/.

Upsite Technologies. "How Rack Density and Delta T Impact Your Airflow Management Strategy." 2025. https://www.upsite.com/blog/rack-density-delta-t-impact-airflow-management-strategy/.

———. "When to Retrofit the Data Center to Accommodate AI, and When Not to." 2025. https://www.upsite.com/blog/when-to-retrofit-the-data-center-to-accommodate-ai-and-when-not-to/.

Uptime Institute. "Data Center Cooling Best Practices." 2025. https://journal.uptimeinstitute.com/implementing-data-center-cooling-best-practices/.

———. "Performance expectations of liquid cooling need a reality check." Uptime Institute Blog. 2025. https://journal.uptimeinstitute.com/performance-expectations-of-liquid-cooling-need-a-reality-check/.

Utility Dive. "The 2025 outlook for data center cooling." 2025. https://www.utilitydive.com/news/2025-outlook-data-center-cooling-electricity-demand-ai-dual-phase-direct-to-chip-energy-efficiency/738120/.

Vertiv. "Deploying liquid cooling in data centers: Installing and managing coolant distribution units (CDUs)." 2025. https://www.vertiv.com/en-us/about/news-and-insights/articles/blog-posts/deploying-liquid-cooling-in-data-centers-installing-and-managing-coolant-distribution-units-cdus/.

———. "Liquid and Immersion Cooling Options for Data Centers." 2025. https://www.vertiv.com/en-us/solutions/learn-about/liquid-cooling-options-for-data-centers/.

———. "Liquid cooling options for data centers." 2025. https://www.vertiv.com/en-us/solutions/learn-about/liquid-cooling-options-for-data-centers/.

———. "Quantifying the Impact on PUE and Energy Consumption When Introducing Liquid Cooling Into an Air-cooled Data Center." 2025. https://www.vertiv.com/en-emea/about/news-and-insights/articles/blog-posts/quantifying-data-center-pue-when-introducing-liquid-cooling/.

———. "Understanding direct-to-chip cooling in HPC infrastructure: A deep dive into liquid cooling." 2025. https://www.vertiv.com/en-emea/about/news-and-insights/articles/educational-articles/understanding-direct-to-chip-cooling-in-hpc-infrastructure-a-deep-dive-into-liquid-cooling/.

———. "Vertiv™ CoolPhase CDU | High Density Solutions." 2025. https://www.vertiv.com/en-us/products-catalog/thermal-management/high-density-solutions/vertiv-coolphase-cdu/.

WGI. "Cooling Down AI and Data Centers." 2025. https://wginc.com/cooling-down-ai-and-data-centers/.
