New Tech Keeps Data Centers Cool in Warm Climates

Hot water from some computers drives the refrigeration of others

Super Cool: Germany’s SuperMUC supercomputer, ranked 9th in the world, uses water to transfer heat. A similar system, iDataCool, uses the heated water to run a refrigerator and cool other computers. Photo: IBM

In a small town at the edge of the Arctic Circle, Facebook has built its first-ever data center outside the United States. Luleå, Sweden—a city dubbed the Node Pole because of the vast amounts of data traffic it will soon be fielding—has all the features to make it an ideal spot for the social network behemoth to extend its operations. It has space: Facebook’s facility occupies the equivalent of five soccer fields. It has cheap electricity: The powerful Luleå River, with its hydroelectric dams, generates an abundance of energy that the center can draw from. And it is constantly frigid: The average temperature in Luleå is a brisk 1.3°C, which makes it cheaper to dissipate the furnacelike heat that such a facility generates.

One problem with a tech company launching a big data-storage facility in the Arctic, though, is that even if the temperature conditions are optimal, talented engineers don’t necessarily want to uproot their lives to work there, says Axel Auweter, a research associate at the Leibniz Supercomputing Centre, in Germany, who worked on a new cooling system for the SuperMUC, a machine that was ranked No. 9 last week on the Top500 list of the world’s mightiest supercomputers. “I think we offer a solution that could actually help those who are still hesitating to move to those northern regions where it’s quite dark in the winter and maybe not so many skilled people want to work,” he says.

Auweter’s group is one of several that are creating new kinds of cooling solutions for supercomputers and large data centers that go beyond regulating temperature with just the available air. His group’s solution uses water instead. One take on that technology was presented last week at the International Supercomputing Conference, in Leipzig, Germany. There, a team from the University of Regensburg, also in Germany, presented results from over a year of operation of a new liquid-cooling technology meant for large-scale, piping-hot computer clusters. The solution could chill the world’s largest data centers by reusing some of the heat the high-performance computer clusters produce.

Basically, that waste heat can be used to drive a special kind of refrigerator called an adsorption chiller. The chillers produce cold water, which is then used to cool other computers in the data center or provide air-conditioning for the site’s human workers. The researchers have christened their system iDataCool.
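
To make the idea concrete, here is a minimal Python sketch of that energy flow, assuming a hypothetical rack heat load and an assumed coefficient of performance (COP) for the chiller. Adsorption chillers typically operate with COPs well below 1; none of these figures come from the iDataCool team.

```python
# Minimal sketch of the energy flow described above: server waste heat,
# carried away as hot water, drives an adsorption chiller that in turn
# produces cold water for other equipment in the data center.
# All numbers here are hypothetical placeholders, not iDataCool figures.

def chiller_cooling_kw(waste_heat_kw: float, cop: float = 0.5) -> float:
    """Cold-water cooling power delivered for a given waste-heat input.

    cop: coefficient of performance of the adsorption chiller.
         Adsorption chillers typically sit well below 1.0; 0.5 is an
         assumed placeholder value.
    """
    return waste_heat_kw * cop

# A rack dissipating a hypothetical 30 kW of heat:
print(chiller_cooling_kw(30.0))  # -> 15.0 kW of cooling for other systems
```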

“The electrical energy that goes into the computer is converted into heat, essentially, and if you could reuse that heat somehow, then you recover a large part of the energy and the cost that you put into this,” says Tilo Wettig, a physics professor at the University of Regensburg who was part of the team that engineered iDataCool.

Most computers, from netbooks to high-performance systems, are cooled by air, which has obvious advantages. There’s no shortage of air, and air-cooling systems are simple to build. Liquid-cooled systems, however, are more expensive and must be carefully considered. They require more maintenance, and water hitting the circuits could short them and cause irreparable damage. All this makes it seem as though water-cooled solutions would be much riskier, but proponents of these platforms say they’ve been thoroughly tried and tested. “We’ve learned a lot during these projects, and there’s a multitude of means to ensure that the operation of such [a] system is safe in terms of not breaking things,” says Auweter. With careful testing of materials and some smart engineering, Auweter asserts, operations can be made safe with less maintenance.

Those are precisely the considerations that went into devising iDataCool. The system is essentially a computer cluster that functions as a research platform for the cooling of IT equipment. Because the project was carried out in collaboration with the IBM Research & Development Laboratory in Böblingen, Germany, researchers based it on a server system from IBM called iDataPlex, which is specially designed for high-performance computing in data centers.

iDataCool consists of three racks with 72 compute servers each. Each node contains either two four-core 2.53-gigahertz Intel Xeon E5630 processors or two six-core 2.40-GHz Intel Xeon E5645 processors, along with 24 gigabytes of RAM. Originally, the iDataPlex system was cooled entirely by air. To build iDataCool, the researchers completely reengineered the cooling system but left all other components untouched.

“We did not make any changes to firmware or the CPU itself,” says Nils Meyer, a research associate and doctoral student at the University of Regensburg who also worked on iDataCool. “However, we removed all the things that are usually there for air cooling—the fans and the heat sinks. These were all replaced with our own design, which allows you to cool the CPU, the memory, and all the components that get quite hot…and cool these directly with water.”

iDataCool employs a scheme called hot-water cooling—the same technique in use at the Leibniz Supercomputing Centre’s SuperMUC. (iDataCool actually acted as a final test bed before the system was installed at SuperMUC.) In the SuperMUC, heat is conducted away from the machines, carried in microchannels that contain water, and then brought to human-occupied areas of a building. “This is a very sensible thing to do,” says Regensburg’s Wettig. “But during the summer, you don’t need to heat that much, and in certain [hot] climates, you don’t need to heat at all.... In that case, you would even need to spend some energy to remove the heat.”

Instead of heating humans, explains Wettig, the Regensburg system uses the heat to power an adsorption chiller and provide them—or other computers in the data center—with extra frostiness. iDataCool’s key innovation is its low-cost, custom-designed copper heat sink, through which the water flows, drawing away heat. The processor heat sink is hard-soldered to a copper pipeline carrying the flowing water, which also connects to heat sinks affixed to other components in the system, such as memory and voltage controllers.

The energy from all this hot water goes to drive the adsorption chiller, which operates efficiently when the water entering is above 65°C. The chiller produces cold water that feeds a separate cooling loop in the data center. External coolers that would be attached to this loop in other setups could thus be completely replaced by the adsorption chiller.

Though the adsorption chiller works best at temperatures above 65°C, the processors themselves consume less power at lower temperatures. So in operation, the processors drew about 5 percent more power than they would have under cooler conditions. That extra wattage, however, was more than offset by a 90 percent improvement in the adsorption chiller’s efficiency. With the system in place, the energy recovered from iDataCool was about 25 percent of what would otherwise have been lost.
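
The arithmetic behind that tradeoff can be laid out in a few lines. The back-of-the-envelope Python sketch below uses only the percentages reported above; the baseline power draw is a hypothetical figure chosen for illustration.

```python
# Back-of-the-envelope view of the tradeoff: running the processors hot
# costs extra power, but it lets the adsorption chiller recover far more.
# Only the percentages come from the article; the baseline is invented.

baseline_power_kw = 100.0                # hypothetical cluster power draw
hot_power_kw = baseline_power_kw * 1.05  # ~5 percent penalty when run hot

penalty_kw = hot_power_kw - baseline_power_kw  # extra power spent: 5.0 kW
recovered_kw = hot_power_kw * 0.25             # ~25 percent recovered

print(f"Extra power spent running hot: {penalty_kw:.1f} kW")
print(f"Energy recovered via chiller:  {recovered_kw:.1f} kW")
# Recovering roughly 26 kW at a cost of roughly 5 kW is why running hot
# pays off despite the processors' higher draw.
```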

Auweter has seen the iDataCool firsthand and thinks it is a well-conceived system. “To be honest, I think...they’re actually really addressing all the important issues of higher-temperature cooling,” he says.

Wettig says the team did not encounter any major problems after installing the system, but he did admit there was one flaw that could be fixed. “We lose a lot of heat to the air of the computing center” because the server racks are poorly insulated, he says. According to the researchers’ projections, with better thermal insulation about 50 percent of the total waste energy could be recovered.

Hot-water cooling with energy reuse is still considered a novel solution for high-performance computers, but Auweter believes it is the best option out there, especially for supercomputers installed in warmer climes. The biggest deterrent to adoption, he says, is the fear of having to put up a little more money for a system that still isn’t widely used. “When it comes to spending lots of money, people tend to get very conservative and stick to proven technologies. In our case, that would mean they stick to air cooling,” says Auweter. “But I think it’s just crucial to have reference to setups of systems like iDataCool, so those people can think beyond the current state of the art or just to demonstrate that the risk is not actually as high as they might think.”
