IBM Tests Heating Homes With Data-Center Waste Heat
Cooling computers with hot water is a step toward zero-emission data centers
20 November 2008—Data centers are notorious energy hogs. In 2006, data centers in the United States ran up an electric bill of US $4.5 billion, according to a U.S. Environmental Protection Agency report. But engineers at IBM’s Zurich Research Laboratory think they can cut data centers’ energy consumption in half by cooling computers with water and reusing the dissipated energy to heat nearby homes. This week, at the International Conference for High Performance Computing, Networking, Storage and Analysis, known as SC08, in Austin, Texas, the IBM engineers presented details of a prototype system, which they say will be commercially available in five years.
The engineers have built a data center that reuses 85 percent of the heat it generates while consuming about half the electricity, says Bruno Michel, manager of the advanced thermal packaging group at IBM’s Zurich laboratory. Instead of using air-conditioning or fans, the data center is cooled with water pumped through microchannels within the computers. The water absorbs the heat from the data center and is then pumped out to nearby houses for heating. The occupants pay for the heat. A 10-megawatt data center could produce enough energy to heat 700 homes, says Michel.
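The per-home figure can be sanity-checked against the other numbers in the article. A quick back-of-the-envelope calculation (assuming the 85 percent reuse fraction applies to the full 10-megawatt load, which the article does not state explicitly) suggests each home would receive roughly 12 kilowatts of heat:

```python
# Back-of-the-envelope check of the per-home heat delivery implied by
# the article's figures. The 85 percent reuse fraction, the 10 MW load,
# and the 700-home count come from the text; the rest is arithmetic.
data_center_power_mw = 10.0   # total power drawn by the data center
reuse_fraction = 0.85         # share of dissipated heat recovered
homes = 700                   # homes Michel says could be heated

recovered_heat_kw = data_center_power_mw * 1000 * reuse_fraction
per_home_kw = recovered_heat_kw / homes
print(f"Recovered heat: {recovered_heat_kw:.0f} kW")
print(f"Per home: {per_home_kw:.1f} kW")
```

About 12 kW per home is on the high side for a well-insulated house but plausible for peak winter heating demand, which is consistent with Michel's remark that the scheme pays off "especially in the winter."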
Michel says the challenge was finding a water temperature cool enough to keep the data center from overheating but warm enough so that the outgoing water could still be used to heat homes.
“The striking thing that we’re doing is cooling the computer with hot water,” says Michel. The water is pumped into the data center at around 35 °C, he says, and when it leaves the data center, it is 60 °C. “In climates like Zurich or New York, it’s a value, especially in the winter,” he adds.
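The quoted temperatures also imply how much water the system must circulate. A rough sketch, using the standard specific heat of water and assuming 85 percent of a 10-megawatt load is carried away across the 35-to-60 °C rise (this is an illustration, not a description of IBM's actual plumbing):

```python
# Rough estimate of the water flow needed to carry the data center's
# heat at the temperatures quoted in the article. The 10 MW load and
# 85 percent recovery fraction come from the text; the specific heat
# of water is a standard physical constant.
recovered_heat_w = 10e6 * 0.85   # 85% of a 10 MW load, in watts
c_p_water = 4186.0               # specific heat of water, J/(kg*K)
delta_t = 60.0 - 35.0            # inlet 35 C, outlet 60 C

# Q = m_dot * c_p * delta_T  =>  m_dot = Q / (c_p * delta_T)
flow_kg_per_s = recovered_heat_w / (c_p_water * delta_t)
print(f"Required flow: {flow_kg_per_s:.1f} kg/s")
```

On these assumptions the loop would move on the order of 80 kilograms of water per second, i.e., tens of liters per second across the whole facility.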
Liquid cooling in general should be much more efficient than traditional air conditioners, says Randy Katz, a data center expert and professor of electrical engineering and computer science at the University of California, Berkeley. “If you’re trying to air-condition a data center, you don’t really know where the hottest parts of the computer are, so you tend to overcool the room,” says Katz, an IEEE Fellow. A liquid cooling method like the one described by IBM can target the hottest areas of the computer, he says.
IBM’s heat reuse scheme relies on cooling server processors with water.
Data centers have been cooled with liquid before, but usually with water at 45 °C, a temperature too low to heat homes. At 45 °C the centers run about 5 to 7 percent more efficiently than with air cooling, Michel says, but the amount of energy wasted by discarding the heat outweighs this slight benefit.
IBM’s method is sure to cost more up front than an air-cooled data center, says Michel. He estimates that building a new data center will be 10 percent more expensive. And retrofitting a recently built air-cooled data center would cost about 30 percent of its original price tag. But those extra costs will be recouped in one to two years because of the energy savings, he says.
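The payback estimate can be illustrated with a simple calculation. The 10 percent construction premium and the roughly 50 percent electricity savings come from the article; the absolute build cost and annual power bill below are hypothetical placeholders chosen only to show the arithmetic:

```python
# Illustrative payback calculation for the liquid-cooling premium on a
# new build. The 10% premium and ~50% electricity savings are from the
# article; the dollar figures below are hypothetical placeholders.
build_cost = 10_000_000          # hypothetical new-build cost, USD
annual_power_bill = 1_000_000    # hypothetical yearly electricity, USD

extra_cost = 0.10 * build_cost           # liquid-cooling premium
annual_savings = 0.50 * annual_power_bill
payback_years = extra_cost / annual_savings
print(f"Payback: {payback_years:.1f} years")
```

With these placeholder figures the premium is recouped in two years; a data center whose power bill is a larger fraction of its build cost would recoup it faster, which matches Michel's one-to-two-year estimate.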
Katz says potential customers have been resistant to the idea of liquid cooling because of the cost of retrofitting a data center, so much research is focused on more-efficient fans and air-conditioning systems.
But, Katz says, the density of computing equipment in rooms continues to increase, which also increases the amount of heat that has to be removed. “We are facing significant challenges in terms of how much cooling we can do through traditional air-conditioning,” he says. “Alternatives are necessary.”