Asicminer, a Hong Kong–based bitcoin mining operation, has taken an unorthodox step to gain an edge over rival operations running the algorithms that generate the virtual currency. To save money on energy, Asicminer puts its servers in liquid baths to cool them.
The result? Asicminer’s 500-kilowatt computing system uses 97 percent less energy on cooling than it would with a conventional method. Its custom-made racks hold computers that are submerged in tanks filled with an engineered fluid produced by 3M that won’t damage the machines. The liquid absorbs the system’s heat, and inexpensive cooling equipment extracts that heat, ultimately expelling it outside.
The bitcoin-mining facility is on the leading edge of a movement to use liquids to cool data centers. Operators of high-performance supercomputers have long understood that liquids trump air in the physics of heat removal. Because liquids are far denser than gases, they can absorb and carry away much more heat per unit volume, making them a more efficient medium for removing unwanted heat.
Yet direct liquid cooling is a rarity in the corporate data centers that run bank transactions and the cloud facilities that serve data to smartphones. Data centers consume more than 1 percent of the world’s electricity and about 2 percent of the electricity in the United States. A third or more of that expenditure is for cooling. Given computing’s growing energy cost and environmental footprint, proponents say it’s just a matter of time before some form of liquid cooling wins out.
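The cooling fractions cited above map onto the data-center industry's standard efficiency metric, power usage effectiveness (PUE): total facility power divided by IT power. A back-of-the-envelope sketch, using illustrative numbers only (the 500-kilowatt load from the Asicminer example, a roughly one-third cooling share, and the claimed 97 percent cooling reduction), shows how much that reduction moves the metric:

```python
# Back-of-the-envelope PUE comparison: conventional air cooling vs. immersion.
# All figures are illustrative, taken loosely from the article (500 kW IT load,
# cooling at roughly one-third of total energy, 97 percent cooling reduction).

def pue(it_kw: float, cooling_kw: float, other_overhead_kw: float = 0.0) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    return (it_kw + cooling_kw + other_overhead_kw) / it_kw

IT_LOAD_KW = 500.0

# If cooling is one-third of the total, then cooling = half the IT load:
# cooling / (IT + cooling) = 1/3  =>  cooling = IT / 2.
air_cooling_kw = IT_LOAD_KW / 2                      # 250 kW
immersion_cooling_kw = air_cooling_kw * (1 - 0.97)   # 97% cut -> 7.5 kW

print(f"Air-cooled PUE:       {pue(IT_LOAD_KW, air_cooling_kw):.2f}")
print(f"Immersion-cooled PUE: {pue(IT_LOAD_KW, immersion_cooling_kw):.2f}")

# Cooling energy saved over a year of continuous operation at this load:
saved_kwh = (air_cooling_kw - immersion_cooling_kw) * 24 * 365
print(f"Cooling energy saved per year: {saved_kwh:,.0f} kWh")
```

Under these assumptions the air-cooled facility sits at a PUE of 1.5 while the immersion-cooled one approaches the ideal of 1.0, with the difference amounting to millions of kilowatt-hours per year at this scale.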
“Air cooling is such a goofy idea,” says Herb Zien, the CEO of LiquidCool Solutions, in Rochester, Minn., which makes immersion-cooling technology. “The problem is that there’s so much inertia and so much investment in the current system that it’s hard to turn back.”
Indeed, over the years many smart people have perfected the art of moving air around data centers for maximum efficiency. They have a number of techniques to choose from, such as setting up hot and cold aisles, using sensors to monitor conditions, and bringing in cold outdoor air for cooling. And the very idea of pumping fluids, especially water, into an expensive server rack requires a leap of faith that not all technology professionals are willing to make.
“Historically, the thinking has been that electronics and liquids don’t mix,” says Steven Hammond, the director of the Computational Science Center at the National Renewable Energy Laboratory (NREL), in Golden, Colo. “Everybody working in data centers is hydrophobic.” NREL flows water into its server racks to remove heat, eliminating the need for power-hungry air conditioners. In the colder months, pumps circulate the heated water to warm the laboratory building.
The average data center spends more than 30 percent of its energy bill just on cooling, making it a major cost to the Googles and Facebooks of the world. But liquid cooling, particularly immersion cooling or circulating water through server racks, has yet to make a big splash in the cloud. Microsoft, which operates more than a million servers worldwide, is sticking with air cooling because it’s proven and scalable, says Kushagra Vaid, general manager of cloud server engineering at Microsoft. “Cost of scaling is a big factor for Microsoft when considering new types of cooling methods,” Vaid says. “Our scale demands standardized and simplified techniques that are deployable across server environments and geographies.”
One maker of immersion cooling, Green Revolution Cooling, in Austin, Texas, claims that its system, in which servers are placed in a tank filled with mineral oil, is 60 percent cheaper than building and operating a new data center. But it does require a change in how data centers are installed and serviced. For example, server fans need to be removed, and technicians need to wear gloves when swapping out servers.
“[Immersion cooling] is interesting technology, but the real question is, How do I implement it in a data center?” says Ed Turkel, the group manager of high-performance computing marketing at HP. “What does a data center look like with these high-tech bathtubs with servers in them?”
The strongest need for liquid cooling is in situations where a lot of compute power is packed into a small space, experts say. The Asicminer system in Hong Kong, for instance, is compact enough to reside in a high-rise building, taking up one-tenth of the space it would if it were air-cooled.
But the trend in building cloud server farms has been the opposite: Locations are chosen for their cheap and plentiful electricity, which often places them in remote areas with plenty of space. “A lot of companies don’t care one iota about power density. If you’re Google, you just build another data center,” says Phil Tuma from the Electronics Markets Materials division of 3M, which makes high-tech liquids for immersion cooling.
In the future, though, data-center operators may want to place their computing power closer to users. There’s also increasing pressure from environmental groups to lower energy use from cloud data centers. Still, whether liquid cooling will break beyond its niche status remains an open question. “There’s a point where the technology stops being used by early adopters and starts being used by the early majority, and there’s a chasm in between,” says Matt Solomon, the marketing director at Green Revolution Cooling. “We’re just waiting for the domino effect.”
A correction was made to this article on 30 January 2014.
About the Author
Martin LaMonica is a journalist covering energy, policy, and business and is a frequent contributor to Energywise. In the December 2013 issue he reported on how the natural gas boom is setting off a boom in distributed energy generation.