
Seven innovative ways to cool a scientific computer

Cooling equipment can account for as much as 40% of a data center’s electricity bill. Computer centers around the world are trying to minimize this cost with innovative methods of cooling. We look at seven.

1) Geothermal:

Image of geothermal cooling of the Olympus high-performance computer at the Pacific Northwest National Laboratory.

This is the geothermal cooling setup for the Olympus high-performance computer. The Rear Door Heat Exchangers (RDHx) are overhead with their rear doors open. Image courtesy Ralph Wescott.

Geothermal cooling uses a renewable resource, the thermal energy stored in the Earth, to regulate and cool a computer’s internal components.

In January 2012, the Olympus high-performance computer (HPC), at the Pacific Northwest National Laboratory (PNNL), Richland, Washington, went online with a geothermal cooling system.

Heated air from the Olympus computer servers is transferred to a large radiator attached to the rear of each computer cabinet. This 41°C (106°F) air heats the water inside the radiator, which is then pumped to a heat exchanger that transfers the heat to a geothermal source. The cooled radiator water is returned to Olympus, entering the computer room at 18°C (64°F).
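As a rough illustration of the water-side arithmetic, the sketch below estimates how much water flow a rear-door heat exchanger would need to carry away a given heat load. The 30 kW rack load and the 18°C to 30°C water temperatures are illustrative assumptions, not PNNL's actual figures.

```python
# Back-of-envelope water-side heat balance for a rear-door heat exchanger.
# All inputs below are illustrative assumptions, not PNNL's actual figures.

CP_WATER = 4186.0      # specific heat of water, J/(kg*K)

def water_flow_for_load(heat_load_w, t_supply_c, t_return_c):
    """Return the water mass flow (kg/s) needed to carry away heat_load_w
    watts when the loop water warms from t_supply_c to t_return_c."""
    delta_t = t_return_c - t_supply_c
    return heat_load_w / (CP_WATER * delta_t)

# Hypothetical 30 kW rack, with 18 C supply water warming to 30 C.
flow_kg_s = water_flow_for_load(30_000, 18.0, 30.0)
print(f"Required flow: {flow_kg_s:.2f} kg/s (~{flow_kg_s * 60:.0f} L/min)")
```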

“Our geothermal source is made of four wells that bring up 16°C (61°F) groundwater. This water absorbs heat at a plate-frame heat exchanger and is then injected back underground through four more wells downstream from the uptake wells,” said Ralph Wescott, data services manager at PNNL.

“Transferring heat from Olympus to a geothermal source of water maintains a steady temperature all-year-round and allows us to avoid mechanical chillers (similar to car air conditioners), which are expensive, failure prone, and consume large amounts of electricity,” said Wescott.

“To my knowledge, our latest implementation for Olympus is the first time that geothermal cooling has been accomplished without mechanical chillers,” he said.  

Their future goal is to completely eliminate the air flow and get cooling fluid closer to the servers. “Minimizing the heat exchange steps that go from a geothermal source to hot internal computer components will generate the best energy-efficient cooling,” said Wescott.

2) Hot water:

Water can capture about 4,000 times more heat than the equivalent volume of air.

The process involves continuously pumping water through pipes in closed-loop or capillary-like channels to the CPU, in order to dissipate the heat generated as its transistors switch on and off.
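That factor can be checked with textbook property values. The short sketch below compares the volumetric heat capacities of water and air at room temperature; the exact ratio depends on temperature and pressure, so treat it as an order-of-magnitude estimate.

```python
# Rough check of the "water captures ~4,000 times more heat than the same
# volume of air" figure, using typical textbook values at room temperature.

water = {"density": 1000.0, "cp": 4186.0}   # kg/m^3, J/(kg*K)
air   = {"density": 1.2,    "cp": 1005.0}   # kg/m^3, J/(kg*K)

def volumetric_heat_capacity(fluid):
    """Heat stored per cubic metre per kelvin, J/(m^3*K)."""
    return fluid["density"] * fluid["cp"]

ratio = volumetric_heat_capacity(water) / volumetric_heat_capacity(air)
print(f"Water stores ~{ratio:.0f}x more heat per unit volume than air")
# Prints roughly 3,500 -- the same order of magnitude as the figure above.
```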

Diagram of Aquasar hot-water cooling system.

This diagram shows how the oxymoronic hot-water cooling system can be used to heat adjacent buildings. Image courtesy IBM Zurich Research Laboratory.

In 2010, the first hot water-cooled HPC was switched on. The system, called Aquasar, was built in partnership between IBM and the Swiss Federal Institute of Technology (ETH Zurich). It’s a thermal power plant operating at the Department of Mechanical and Process Engineering at ETH Zurich.

The Aquasar module consists of special water-cooled IBM servers. The warm water keeps the microchips at an optimal temperature of 60°C (140°F), which is well under the processor overheating threshold of 85°C (185°F). The HPC system runs at six teraflops and consumes 20 kilowatts of power.

“We’ve found electronics work better at high temperatures. In my opinion, water is a very efficient way to cool high-performance computers. Of course there are risks of leakage because water is an electrical conductor, but we have redundancies in place. Since we switched on our system, it’s been running flawlessly. By using hot water, you can halve the energy usage,” said Dimos Poulikakos, head of the laboratory of thermodynamics in new technologies, ETH Zurich.

“Our HPC system is connected to the water system of ETH Zurich. Excess heat from our computers provides part of the hot water they need for taps and heating for buildings, e.g. we take 50°C (122°F) of heat and give them back 60°C (140°F),” he said.   

This summer, Aquasar's big brother, SuperMUC, a petaflop high-performance computer, will be unveiled at the Leibniz Supercomputing Center (LRZ) in Garching, Germany.

3) Liquid metal:

Liquid metal has the potential to cool chips at the limit of nanometers (billionths of a meter). “It can absorb and transfer large quantities of heat, and is very dense, so large volumes of the metal are not required,” said Carlos Maidana, a physicist at CERN.


In a cooling system, liquid metal is delivered to and from the CPU by an electromagnetic pump. The pump is useful because it has no moving parts, and emits no noise or vibrations. “This electromagnetic pump makes use of the conducting capacities of liquid metals to induce electromagnetic fields inside the liquid, which generates a Lorentz Force that ‘pumps’ the liquid metal from one point to another without the need of moving parts,” Maidana said.
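In an idealised conduction-type pump of this kind, a current driven across the channel at right angles to a magnetic field produces a pressure rise of roughly the current times the field strength divided by the channel height. The sketch below works through that textbook relation with made-up numbers; it is not based on any particular device.

```python
# Idealised pressure rise from a DC conduction electromagnetic pump: a current
# I driven across the liquid metal, perpendicular to a magnetic field B,
# produces a Lorentz force along the channel. Ignoring losses,
#   delta_p ~= I * B / h, where h is the channel height along the field.
# The numbers below are illustrative assumptions only.

def em_pump_pressure(current_a, field_t, channel_height_m):
    """Ideal pressure rise (Pa) of a conduction-type EM pump."""
    return current_a * field_t / channel_height_m

# Hypothetical example: 20 A across a 5 mm high channel in a 0.5 T field.
dp = em_pump_pressure(20.0, 0.5, 0.005)
print(f"Ideal pressure rise: {dp:.0f} Pa (~{dp / 1000:.1f} kPa)")
```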

In 2007, researchers from the Technical Institute of Physics and Chemistry at the Chinese Academy of Sciences, Beijing, China, demonstrated the first liquid metal cooling device without moving solid components to cool a computer chip. They used liquid gallium, a non-toxic, environment-friendly cooling fluid that transfers larger amounts of heat than water.

Further research at the Chinese Academy of Sciences suggested that combining nano-sized particles of aluminium or copper, or carbon nanotubes, with liquid gallium would create the most efficient heat-conductive coolant yet.

4) Liquid submersion:

A computer’s components are submerged in a thermally conductive, heat-transferring liquid, such as water, oil, or specially created fluids.

This demonstration video shows Green Revolution Cooling's liquid submersion in action. One of the best examples is when a server rack is lifted out of the cabinet, dripping with mineral oil. Video courtesy Green Revolution Cooling.

The liquid must also have a low electrical conductivity – or be electrically inert – so that it does not interfere with the normal operation of a computer's components.

“Instead of the server being cooled by high-velocity air, submersion cooling surrounds all the heat-generating components in the server with a non-conductive fluid that has 1,200 times the heat capacity (by volume) of air,” said Christiaan Best, CEO of Green Revolution Cooling, a US company that cools computers at the Texas Advanced Computing Center at the University of Texas at Austin with a dielectric (non-conductive) submersion coolant called GreenDEF mineral oil.

“It’s a blend of highly refined mineral oils, is colorless, odorless, does not evaporate, and is safe for human exposure and even consumption,” said Best.

“The coolant is so much better at absorbing heat that we can maintain the coolant temperature considerably above ambient, while still keeping the server components 10 to 15°C (18 to 27°F) cooler than if they were in a standard 25°C (77°F) air-cooled data center,” he said.
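The practical consequence of that heat-capacity gap is that very little oil flow is needed. The sketch below compares the coolant flow required to remove the same hypothetical heat load with mineral oil and with air; the fluid properties are generic textbook values, not GreenDEF specifications.

```python
# Rough comparison of the coolant flow needed to remove the same heat load
# with mineral oil versus air. Property values are generic textbook figures.

CP_OIL, RHO_OIL = 1670.0, 850.0   # J/(kg*K), kg/m^3 (typical mineral oil)
CP_AIR, RHO_AIR = 1005.0, 1.2     # J/(kg*K), kg/m^3

def volume_flow(heat_w, cp, rho, delta_t_c):
    """Volumetric flow (m^3/s) to carry heat_w watts at a delta_t_c rise."""
    return heat_w / (cp * rho * delta_t_c)

heat = 10_000.0   # hypothetical 10 kW rack
dt = 10.0         # allow the coolant to warm by 10 C
oil = volume_flow(heat, CP_OIL, RHO_OIL, dt)
air = volume_flow(heat, CP_AIR, RHO_AIR, dt)
print(f"Oil: {oil * 1000:.2f} L/s   Air: {air:.2f} m^3/s   ratio ~{air / oil:.0f}x")
# The ratio comes out at roughly 1,200, consistent with the figure above.
```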

Pumps circulate the coolant around the computer components and the heat is expelled outside.

“Dielectric fluid submersion cooling is less expensive to purchase and operate [compared to water or other cooling technologies], requires little energy, and has the highest power density per server rack of any widely available cooling solution,” Best said.   

5) Phase-change:

This method uses evaporation - a change of phase - to remove heat from computer processors. Evaporation removes heat quickly and efficiently, just as sweating cools our skin.

PNNL has been using phase-change cooling technology for one of its high-performance computers since 2007. The laboratory's Energy Smart Data Center uses a phase-change cooling system called SprayCool for its NW-ICE IBM computing cluster.

Image of a computer motherboard at Pacific Northwest National Laboratory that uses phase change cooling.

This is a close-up of one of NW-ICE's motherboards. It shows how tubing delivers liquid perfluorohexane to cool the processing chips. Image courtesy Andres Marquez.

“We use a perfluorocarbon (PFC) fluid [an organic compound composed of strongly bonded carbon and fluorine]. PFC transitions from a liquid stage to a gaseous stage. This phase change provides cooling capacity through the absorption of latent heat, which is the heat absorbed or released by a body during a phase change that occurs without changing the body’s temperature,” said Andres Marquez, technical manager of PNNL’s Energy Smart Data Center.

“Tubing delivers liquid PFC to the system’s processing chips. Exposed to heat, the liquid partially evaporates, after which tubing directs the PFC back to a heat exchanger, where it is allowed to condense into a liquid before it is reapplied to cool the system once more,” he said.
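For a sense of scale, the sketch below estimates how much coolant must boil off per second to absorb a processor's heat output through latent heat alone. The latent-heat value and the 150 W chip are assumptions for illustration, not SprayCool specifications.

```python
# Order-of-magnitude sketch: how much PFC must evaporate per second to absorb
# a chip's heat output through latent heat alone. The latent-heat value is an
# assumed, typical figure for perfluorocarbon coolants.

LATENT_HEAT_PFC = 90_000.0   # J/kg, assumed latent heat of vaporisation

def boil_off_rate(chip_power_w):
    """Mass of coolant (kg/s) that must change phase to absorb chip_power_w."""
    return chip_power_w / LATENT_HEAT_PFC

# Hypothetical 150 W processor.
rate = boil_off_rate(150.0)
print(f"Boil-off rate: {rate * 1000:.1f} g/s ({rate * 3600:.1f} kg/h)")
```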

Marquez said phase-change cooling is an attractive proposition. “It could work well if a system is densely packed. It can also be helpful if a system is deployed in a location where air cooling would be too expensive to operate in terms of power consumption or contamination. Compared to direct deionized (non-conductive) water cooling, PFCs can be trimmed easily to the right boil-off temperature, and are chemically inert, stable dielectrics that will not short-circuit electronics in the event of a leak.”

6) Super-cool:

Certain materials lose all electrical resistance at extremely low temperatures. In the 1950s, liquid helium was used to chill voltage-to-frequency converters, known as Josephson junctions, to these temperatures. Liquid helium can go as low as –271°C (−456°F), which is colder than most of outer space.

The device exhibited two states: ‘on’ or ‘off’. When these junctions were put in a loop and combined with an inductor, which creates a magnetic field, unique internal magnetic fluxes were produced; such circuits are called Josephson junction loops.

Image of a liquid helium cooling unit within a custom desktop computer.

In 2009, 'overclocker' enthusiasts created a custom liquid-helium cooling unit within a desktop PC at the QuakeCon gaming event. Their processor reached a record-breaking speed of 7.08 GHz. Image courtesy AMDUnprocessed.

These loops could be used in digital logic gates for ultra-high-performance computers. These devices could operate at -269°C (-452°F), with very low power and at very high speeds.
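The temperature figures in this section are easier to follow with the conversions spelled out; the small sketch below checks them.

```python
# Quick check of the temperature conversions used in this section.

def c_to_f(celsius):
    return celsius * 9.0 / 5.0 + 32.0

def c_to_k(celsius):
    return celsius + 273.15

for t in (-271.0, -269.0):
    print(f"{t} C = {c_to_f(t):.0f} F = {c_to_k(t):.2f} K")
# -271 C ~= -456 F (about 2 K); -269 C ~= -452 F (about 4 K, liquid helium).
```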

“This phenomenon is a consequence of quantum mechanics, and I cannot explain quantum mechanics in layman’s terms or any other. Trying to explain quantum mechanics requires a teaching degree from Hogwarts. All kidding aside, there are really cool things you can do at those temperatures,” said Thomas Sterling, a senior scientist from Indiana University.

In the late 1990s, Rapid Single Flux Quantum logic gates were made that operated at frequencies of over 700 gigahertz. This technology was considered for the Hybrid Technology Multi-Threaded architecture, an early point design for petaflop HPCs.

“Today researchers are excited about using these techniques for quantum computing. Using quantum mechanics, it may be possible to build a class of computer that could solve certain problems that couldn’t be solved in the lifetime of an individual using conventional HPC. One highly experimental system is called the quantum cellular architecture from the University of Notre Dame,” said Sterling.

He said, “To make such systems requires super-cooled temperatures well below that of the relatively balmy liquid helium: temperatures of millikelvins, or a hundred times cooler.”

7) Water cooling with a twist:

The Tier 1 grid site CC-IN2P3, part of the Worldwide Large Hadron Collider Computing Grid, in Lyon, France, has servers that each consume 500 watts of electricity, most of which is converted into heat.

Graphical display of temperature monitoring tool for a Grid computing center.

The graphical display of CC-IN2P3's temperature monitoring tool. It tracks various temperature probes inside the computing servers, with individual servers shown in separate rows. Dark blue represents a temperature of 15°C and saturated red 50°C. This particular image was taken during an air-conditioning failure. One server (the 'tiny' number in the blue column on the far right, with a red vertical line in between) died because of the high temperatures. You can also see some hotspots: servers at the top and bottom appear warmer than those in the middle. Image courtesy Fabien Wernli.
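For readers curious how such a display works, the sketch below shows one way a linear temperature-to-colour ramp like the one described in the caption could be computed. It is an illustration only, not CC-IN2P3's actual monitoring code.

```python
# Minimal sketch of a linear temperature-to-colour scale like the one in the
# display above: blue at 15 C fading to saturated red at 50 C.

T_MIN, T_MAX = 15.0, 50.0

def temperature_colour(temp_c):
    """Map a probe reading to an (R, G, B) tuple on a blue-to-red ramp."""
    # Clamp to the displayed range, then interpolate linearly.
    frac = (min(max(temp_c, T_MIN), T_MAX) - T_MIN) / (T_MAX - T_MIN)
    return (int(255 * frac), 0, int(255 * (1.0 - frac)))

print(temperature_colour(15.0))   # (0, 0, 255)  cool server, blue
print(temperature_colour(50.0))   # (255, 0, 0)  overheating server, red
```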

Unwanted heat is moved by a system of radiators and fans from the front to the back of the servers, into a common enclosed area. Then, heat exchangers between each server cabinet extract the heat using radiators and transfer it to a closed cold-water circuit.

Cold water is pumped from a large, chilled water tank that holds 24 cubic meters (31 cubic yards) of 8°C (46°F) water. This circuit is kept cool by two large air conditioning (AC) units.
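That tank also acts as a thermal buffer during incidents like the air-conditioning failure pictured above. The sketch below estimates how long 24 cubic meters of 8°C water could absorb the room's heat before warming appreciably; the server count and the allowable temperature rise are assumptions, and only the tank volume, water temperature, and per-server power are quoted above.

```python
# How long could the 24 m^3 chilled-water tank absorb the room's heat if the
# AC units failed? The server count and allowable warm-up are assumptions.

CP_WATER, RHO_WATER = 4186.0, 1000.0   # J/(kg*K), kg/m^3

def buffer_minutes(tank_m3, allowed_rise_c, total_load_w):
    """Minutes until the tank warms by allowed_rise_c under total_load_w."""
    stored_j = tank_m3 * RHO_WATER * CP_WATER * allowed_rise_c
    return stored_j / total_load_w / 60.0

# Hypothetical: 1,000 servers at 500 W each, water allowed to warm from 8 C to 15 C.
print(f"~{buffer_minutes(24.0, 7.0, 1000 * 500.0):.0f} minutes of buffer")
```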

“The advantage of using this type of cooling is to be able to buy standard rack-mounted servers, which keeps the cost low,” said Fabien Wernli, system administrator at CC-IN2P3.

“The AC units on the roof produce heat just like the back of your refrigerator, at around 50°C (122°F). Another water pipe takes advantage of this heat and can deliver free, warm water to whoever needs it. The first customer will most likely be the new restaurant,” said Wernli.

“Using waste heat to warm offices isn't common practice yet and can be considered pioneering work. We have complex monitoring tools, which let us measure temperatures at various points in the cooling circuit: at the AC units, pipes, room, servers, and so on,” he said.

This is an important point for making computing centers around the world more energy efficient. He said, “Energy efficiency has gone from being a challenge to a necessity, as everyone in the HPC business prepares for the exaflop barrier.”
