20 November 2008—Data centers are notorious energy hogs. In 2006, data centers in the United States ran up an electric bill of US $4.5 billion, according to a U.S. Environmental Protection Agency report. But engineers at IBM’s Zurich Research Laboratory think they can cut data centers’ energy consumption in half by cooling computers with water and reusing the dissipated energy to heat nearby homes. This week, at the International Conference for High Performance Computing, Networking, Storage and Analysis, known as SC08, in Austin, Texas, the IBM engineers presented details of a prototype system, which they say will be commercially available in five years.
The engineers have built a data center that reuses 85 percent of the heat it generates while consuming about half the electricity of a comparable air-cooled facility, says Bruno Michel, manager of the advanced thermal packaging group at IBM’s Zurich laboratory. Instead of using air-conditioning or fans, the data center is cooled with water pumped through microchannels within the computers. The water absorbs the heat from the data center and is then pumped out to nearby houses for heating. The occupants pay for the heat. A 10-megawatt data center could produce enough energy to heat 700 homes, says Michel.
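Those figures can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes essentially all of the data center's electrical input ends up as heat and applies the article's 85 percent reuse figure; the per-home result is an inference, not a number from the article.

```python
# Back-of-the-envelope check of the article's figures.
# Assumption: all 10 MW of electrical input is dissipated as heat.
DATA_CENTER_POWER_MW = 10.0   # from the article: a 10-megawatt data center
HEAT_REUSE_FRACTION = 0.85    # from the article: 85 percent of heat reused
HOMES = 700                   # from the article

reusable_heat_kw = DATA_CENTER_POWER_MW * 1000 * HEAT_REUSE_FRACTION
per_home_kw = reusable_heat_kw / HOMES

print(f"Reusable heat: {reusable_heat_kw:.0f} kW")       # 8500 kW
print(f"Average per home: {per_home_kw:.1f} kW")         # about 12.1 kW
```

An average of roughly 12 kW per home is on the order of typical residential heating demand in a cold climate, so the 700-home figure is at least internally plausible.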
Michel says the challenge was finding a water temperature cool enough to keep the data center from overheating but warm enough so that the outgoing water could still be used to heat homes.
“The striking thing that we’re doing is cooling the computer with hot water,” says Michel. The water is pumped into the data center at around 35 °C, he says, and when it leaves the data center, it is 60 °C. “In climates like Zurich or New York, it’s a value, especially in the winter,” he adds.
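That 25 °C temperature window also fixes roughly how much water the system must circulate. A rough sketch using the standard heat-balance relation Q = ṁ·c·ΔT; the 8.5 MW heat load is an assumption combining the article's 10 MW and 85 percent figures, not a number IBM reported:

```python
# Rough estimate of the water flow needed to carry the heat load.
SPECIFIC_HEAT_WATER = 4186.0   # J/(kg*K), standard value for liquid water
T_IN_C = 35.0                  # inlet temperature, from the article
T_OUT_C = 60.0                 # outlet temperature, from the article
HEAT_LOAD_W = 8.5e6            # assumed: 85% of a 10 MW data center load

delta_t = T_OUT_C - T_IN_C                                  # 25 K rise
mass_flow_kg_s = HEAT_LOAD_W / (SPECIFIC_HEAT_WATER * delta_t)

print(f"Required water flow: {mass_flow_kg_s:.0f} kg/s")    # about 81 kg/s
```

On the order of 80 liters per second for the whole facility, a modest flow compared with the air volumes conventional data-center cooling moves for the same heat load.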