

Posted by 非著名问天 on 2024-05-10 from Inner Mongolia


Harnessing Waste Heat is the Latest Frontier in Data Center Efficiency

Translator's Note

The AI wave is driving another boom in data center construction. After the energy-efficiency gains of recent years, making better use of the heat these facilities generate is the next meaningful frontier. Integrating a data center with its surrounding campus to raise overall energy efficiency may need to be planned from the very start.



The increase in power-hungry AI hardware underscores the need for data centers to adopt waste heat reuse solutions.



Data centers have been on an energy efficiency drive for two decades. They continue to find ways to lower energy usage, cool more effectively, and reduce operational costs. The latest frontier in this endless quest is the capture and reuse of waste heat.


The quantity of waste heat is only going to increase in data centers as AI applications are added and more liquid cooling is used. This growing trend underscores the need for data centers to adopt waste heat reuse solutions, not only to enhance energy efficiency but also to mitigate environmental impact and reduce operational costs.


 “Heat rejection from chips to the facility remains challenging,” said Peter de Bock, Program Director for the ARPA-E COOLERCHIPS Program.


While channeling waste heat out of the facility is one approach, it is far more efficient and environmentally friendly to utilize waste heat to generate power, steam, heating, or even cooling. This is a vibrant area of research.


“Facilities can reduce the cost burden and their carbon footprint by finding ways to harness waste heat,” said Kyle Mangini, who oversees all laboratory and mechanical systems at the Amherst College Science Center.

Multifunction Campuses


Brian Rener, Mission Critical Leader at SmithGroup, believes there are plenty of sustainability and efficiency opportunities when data centers are collocated within multifunction buildings or campus settings. Since data centers require continuous cooling, they expel large quantities of waste heat from servers and other equipment that can be harnessed in various ways to reduce overall energy consumption and further lower power usage effectiveness (PUE).
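PUE, the metric Rener refers to, is simply total facility energy divided by IT equipment energy. A minimal sketch, with hypothetical annual figures (not numbers from the article):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.
    1.0 is the theoretical ideal; every kWh of cooling or distribution
    overhead avoided (e.g. by reusing waste heat) pushes PUE downward."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures for a small facility:
it_load = 8_000_000    # kWh consumed by servers, storage, network
overhead = 3_200_000   # kWh for cooling, power distribution, lighting
print(round(pue(it_load + overhead, it_load), 2))  # → 1.4
```

Heat reuse does not change the IT load; it shrinks the overhead term (or offsets energy use elsewhere on the campus), which is why collocated facilities can report very low effective PUEs.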


“As we expect a doubling of data center electricity consumption over the next couple of years, most data centers don’t have the option of moving to more northerly locations to take advantage of cool outside air as a way to improve sustainability,” said Rener. “Alternatively, they can gain value from their waste heat by collocating in mixed-use environments, within campuses, or becoming more embedded within local communities.”


Build Smart


In the movie “Field of Dreams,” Ray Kinsella (Kevin Costner) is told, “Build it and they will come.” He does so – in the middle of an Iowa cornfield. That may have worked in the movie, but Rener doesn’t see that as the best place for a data center. Such a location makes it hard to maximize the value of waste heat. Instead, his advice is to seek out a larger community and have the data center be part of a broader energy solution.


Rener backs up his assertion by citing some data from the US Energy Information Administration (EIA) that shows space heating to be the largest single energy end use in commercial buildings at 32%. Cooling was only 9%, but that number would be much larger in Southern states. A single large building consumes 1 MW for heating. 25 city blocks require 10 MW, and an 8,500,000 square foot commercial property needs 100 MW for heating alone. Now factor in data center density. High-Performance Computing (HPC) and AI are driving rack densities to 100 kW and beyond. Aisles are being packed with processors with more power than ever.
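Putting Rener's figures next to the rack densities he mentions gives a feel for the scale of the opportunity. The sketch below is back-of-envelope arithmetic; the hall size and the 90% recoverable-heat fraction are assumptions for illustration, not figures from the article:

```python
RACK_KW = 100                  # high-density AI/HPC rack (per the article)
NUM_RACKS = 100                # hypothetical data hall
HEATING_MW_PER_BUILDING = 1    # EIA-derived figure cited by Rener

it_load_mw = RACK_KW * NUM_RACKS / 1000   # 10 MW of IT load
# Essentially all electrical input leaves the hall as heat; assume 90%
# is practically recoverable at a useful temperature (an assumption).
recoverable_mw = it_load_mw * 0.9
print(int(recoverable_mw / HEATING_MW_PER_BUILDING))  # → 9 large buildings
```

Even a modest hall of high-density racks sheds enough heat, in principle, to warm several large commercial buildings — if there are buildings nearby to take it, which is Rener's point about siting.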


“All processors are getting bigger and hotter,” said Shen Wang, principal analyst at Omdia. “Since 2000, the power consumption of processors has increased by 4.6 times.”


This certainly poses a serious problem in terms of power and cooling. But it also opens up an opportunity due to the sheer quantity of much hotter waste heat being emitted from chips and servers.

College Campus Cooling


Milwaukee School of Engineering provides a useful case study in waste heat reuse. It added a computer science building with a supercomputer inside known as Rosie. This Nvidia GPU-accelerated supercomputer helps students study AI, drones, robotics, and autonomous vehicles. While the supercomputer room occupies only 1,500 sq ft of the 65,000 sq ft structure, it consumes over 60% of its energy. As such, the system needed to be seamlessly integrated into the building and mechanical/electrical/plumbing infrastructure. The data center is designed for N+1 redundancy, allowing it to remain functional even in the event of a single component failure, thanks to multiple backup generators and cooling units.


The engineering team developed a symbiotic relationship between the academic building energy systems and the supercomputer systems. For instance, the computer room and academic building use the same cooling system during summer months, when the facility’s chilled water return line is used for data center supply. By elevating the chilled water return, cooling efficiencies for the whole building were increased. During the winter, when the academic building no longer requires mechanical cooling, the computer facility utilizes cold outside air via dedicated air-cooled roof-top condensers and integrated free-cooling circuits.
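The seasonal switching the Milwaukee team describes can be thought of as a simple mode selector. The temperature threshold below is an illustrative assumption, not the facility's actual setpoint:

```python
def cooling_mode(outdoor_temp_c: float, building_needs_cooling: bool) -> str:
    """Pick a heat-rejection path for the computer room.
    The 10 °C free-cooling threshold is an illustrative assumption,
    not an MSOE setpoint."""
    if building_needs_cooling:
        # Summer: share the building's chilled-water plant; the data
        # center draws from the elevated chilled-water return line,
        # raising cooling efficiency for the whole building.
        return "shared chilled-water loop"
    if outdoor_temp_c < 10.0:
        # Winter: reject heat via dedicated air-cooled roof-top
        # condensers and the integrated free-cooling circuits.
        return "free cooling (outside air)"
    return "mechanical cooling (dedicated)"
```

The design choice worth noting is that neither system is sized in isolation: the computer room rides on the building plant in summer and falls back to free cooling in winter, so mechanical cooling runs only in the shoulder conditions.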


“The use of waste heat is key in raising data center and building efficiency,” said Jamison Caldwell, Principal Mechanical Engineer at SmithGroup.


As well as finding ways to harness waste heat from hot aisles, he pointed out that liquid cooling equipment also emits radiant heat. He works with the energy reuse factor, which is calculated as reused energy divided by total energy consumed. “There are numerous campus-level opportunities for waste heat,” said Caldwell.
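The energy reuse factor Caldwell mentions is straightforward to compute. A minimal sketch, with hypothetical figures:

```python
def energy_reuse_factor(reused_kwh: float, total_kwh: float) -> float:
    """ERF = energy exported for reuse / total data center energy.
    0.0 means no reuse; values approach 1.0 as more of the facility's
    energy leaves as useful heat rather than being rejected."""
    if total_kwh <= 0:
        raise ValueError("total energy must be positive")
    return reused_kwh / total_kwh

# Hypothetical: 2.5 GWh of heat exported against 10 GWh consumed.
print(energy_reuse_factor(2_500_000, 10_000_000))  # → 0.25
```

Unlike PUE, which rewards spending less on overhead, ERF rewards putting the unavoidable heat to work — which is why liquid cooling's higher-grade (hotter) waste streams matter: hotter water is useful to more downstream heating loads.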

NREL Energy Recovery Loop


The National Renewable Energy Lab (NREL) built the Energy Systems Integration Facility (ESIF), designed to match the heating demands of its labs and offices with its supercomputer-based data center to make the entire building more energy efficient. It has achieved a PUE of 1.04, making it one of the most efficient data centers in the world.


“100% of office heating is done through waste heat reuse, and water use has been cut in half,” said Caldwell. “If there is a sudden need for more heat, campus steam can be used.”


This is achieved courtesy of an energy recovery water loop that spans campus heating and cooling systems, supercomputing systems, and legacy IT systems, and gathers waste heat from both liquid and air-cooling systems. Caldwell characterized this as building-level energy exchange. “Liquid cooling allows for higher grade hydronic heat recovery,” he said.


Liquid cooling could result in a major shift in data center cooling efficiency and bring about drastically lower PUEs – or at least prevent PUEs from rising as rack density soars. But Jason Matteson, vice president of customer solutions architecture at Iceotope, pointed out that the potential gains from liquid cooling could be squandered due to various areas of waste, including a lack of hot air containment, missing blanking panels, failure to replace power-hogging fans, and more. 


“There is a lot of waste present in most data centers that we can eliminate,” said Matteson.

深知社 (DeepKnowledge)

Translated by Eric, a founding member of the DKV (DeepKnowledge Volunteer) program.
