Data Center Journal

VOLUME 44 | JUNE 2016

Is a Liquid-Cooled Data Center in Your Future?

By Herb Zien

Most data centers in operation today defy logic. They are cooled by circulating conditioned air around the data processing room and through the racks. Separate hot and cold aisles are maintained in an attempt to conserve energy. In most installations, cold air is forced up through holes in the floor. And humidity control is necessary to avoid condensation on IT equipment if humidity is too high, or electrostatic discharge if it is too low.

Air-cooled data centers are expensive to build and operate. Up to 15% of the total power supplied to a data center can be used to circulate air, and another 15% is used by rack and blade fans. Not only are fans inefficient, they fail. Fan cooling also limits power density, which is critical to reducing the white-space footprint as well as maintenance and infrastructure costs.

Cooling with air creates problems beyond wasting energy and space. Contact between air and electronics leads to oxidation and tin whiskers. Pollutants in the air cause additional damage. Filters clog, resulting in overheating. Fans transmit vibrations that loosen solder joints, and they generate heat that must be dissipated. Many data centers operate at noise levels so high that OSHA regulations require earplugs.

It gets even worse. Raising the temperature in a data center to reduce the need for mechanical refrigeration causes fans in the central air-handling system, CRAC units and device chassis to spin faster to move more air. Fan energy increases as the cube of the airflow, which means doubling the airflow requires eight times more energy.

All of these problems can be avoided through liquid-cooled data centers. It's simple physics: liquids cool electronics 1,000 times more effectively than air. Air is an insulator with negligible heat capacity or thermal mass. Warm air rises and cold air sinks, so if a data center has a raised floor and cold air is blown uphill, energy is wasted fighting gravity.

Ironically, some of the earliest computer installations were liquid cooled, but the technology available then was expensive, messy, difficult to maintain and inconvenient, and water leaks had the potential to be catastrophic. Air conditioning for employee comfort was already installed in the building, so the simplest thing to do was expand the AC system to pick up the additional cooling load of the server rooms. Rather than isolating and solving the data center cooling problem, a bandage was applied: an easy fix.

A lot has changed in the past few years. Energy waste and carbon footprints have become high-visibility issues. Rack power densities have increased, in some cases to the point where air cooling is bumping against thermodynamic limits. The bandage is becoming unstuck. Importantly, some liquid-cooling technologies available now overcome the perceptions that carried over from the old days. Liquid-cooled IT devices can be neat, easy to maintain, scalable and inexpensive. In some cases it is possible to commercially recycle much of the input energy to heat buildings or domestic hot water, cutting the carbon footprint even further.

Three technologies have emerged to cool electronic equipment with liquids: cold plates, in-row cooling and immersion in a dielectric fluid. Cold plates, originally designed to enable gamers to overclock their machines…
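
To make the fan-law arithmetic above concrete, here is a minimal sketch of the cube relationship between airflow and fan power. The 10 kW baseline fan load is a hypothetical figure chosen for illustration, not a number from the article.

```python
# Fan affinity law: fan power scales with the cube of airflow.
def fan_power(baseline_kw: float, flow_ratio: float) -> float:
    """Fan power after scaling airflow by flow_ratio (power ~ flow^3)."""
    return baseline_kw * flow_ratio ** 3

baseline_kw = 10.0  # hypothetical baseline fan load, kW
for ratio in (1.0, 1.25, 1.5, 2.0):
    print(f"{ratio:.2f}x airflow -> {fan_power(baseline_kw, ratio):5.1f} kW")
# 2.00x airflow -> 80.0 kW: doubling the airflow costs eight times the energy.
```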
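The claim that liquids far outperform air can be sanity-checked with textbook room-condition property values for air and water (the densities, specific heats, rack load and temperature rise below are standard approximations and illustrative assumptions, not figures from the article). Water carries roughly 3,500 times more heat per unit volume, so the coolant flow needed to remove a given load shrinks accordingly.

```python
# Back-of-envelope coolant comparison using Q = rho * V_dot * cp * dT.
# Property values are standard approximations at room conditions.
def flow_needed(heat_w: float, density: float, cp: float, dt: float) -> float:
    """Volumetric flow (m^3/s) to remove heat_w watts at a dt-kelvin rise."""
    return heat_w / (density * cp * dt)

heat_w = 20_000.0  # hypothetical 20 kW rack
dt = 10.0          # allowed coolant temperature rise, K
air = flow_needed(heat_w, density=1.2, cp=1005.0, dt=dt)      # air
water = flow_needed(heat_w, density=998.0, cp=4180.0, dt=dt)  # water

print(f"air:   {air:.2f} m^3/s")        # ~1.66 m^3/s of air
print(f"water: {water * 1000:.2f} L/s") # ~0.48 L/s of water
print(f"volume ratio (air/water): ~{air / water:.0f}x")  # ~3,459x
```

Note that this volumetric comparison understates the author's "1,000 times" figure in one sense and overstates it in another: the practical advantage also depends on convective heat-transfer coefficients at the component surface, which this sketch does not model.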
