Cooling the Hottest Rooms

A creative renovation of Stanford University's data centers saves money and energy


This podcast is part of the Sustainable Design radio program, a collaboration between IEEE Spectrum and the National Science Foundation.

In California, Stanford University’s server room just got a makeover—to save money and energy.

Susan Hassler: Ancient pueblos of the American Southwest and historic bungalows in California have one thing in common: They are built to harness the natural flow of air. Shaded areas cool air as it flows indoors. And as hot air rises up towards the ceiling, it can be funneled outside, creating a natural current through the structure. How can data centers make use of these old-fangled designs? As Amy Coombs reports from Stanford University, something as simple as controlling the flow of air can cool down the hottest rooms.

Amy Coombs: Data centers across the United States consume more energy than all the nation’s color televisions combined: somewhere between 50 billion and 70 billion kilowatt-hours every year.

Amy Coombs: Touring Stanford University’s data center, I step into a room of servers that never sleep, and the cooling fans roar in the background.

Amy Coombs: Stanford’s director of sustainable IT, Joyce Dickerson, just spent months remodeling the university’s computing center.

Joyce Dickerson: So you may notice it seems a little louder in here. That’s because this is our high performance compute center. These are the computers used for a lot of the intensive research on campus—genomics, atmospheric, volcanoes, stem cells. So we’ve got sections that run very, very hot.

Amy Coombs: I walk down the back of a server row. These 6-foot-high racks of computers can generate 150-degree temperatures, and I can feel the warmth coming off the machines. But then I see a sliding door built across the center of the aisle, where the servers face inward. Dickerson slides the door open, and cool air spills out. It now feels like the cold-food aisle of the grocery store.

Joyce Dickerson: So you will see over here what we’ve done is you have a row of the fronts of the computer—we call that the cold aisle, and that’s what we want to keep between 70 and 80 degrees. We have a door that goes across the sides of the cold aisle so all the cold air we push into the cold aisle doesn’t spill out the sides and creep into the hot aisle, where it’s coming out about 30 degrees warmer.

Amy Coombs: Beneath our feet lies a ducting system where fans pull in air from outside, chill it, and blow it towards the servers. Dickerson says that this conventional raised floor used to be filled with wiring and sometimes even canned drinks stored by engineers. This blocked the airflow and wasted energy. She opens a large metal floor grate to show me the new, more efficient duct system.  

Joyce Dickerson: So here, the floor is fairly empty. There is 18 inches of empty space, and that’s so the air can move around freely. We’ve moved most of the cabling and networking up to the top above us because you don’t want that getting in the way of the air flowing around down here. Before people thought about it, they would stick just about anything down here. There are stories of people finding six-packs of beverages and all sorts of things down here. So cleaning out under the floor is critical to keep the air flowing so these systems work correctly.

Amy Coombs: Last winter, Hewlett-Packard unveiled a similar design in its Wynyard Data Center in Newcastle, England. That building’s entire ground floor is used to conduct air. Giant fans line the side of the building, bringing in sea air so cool that it doesn’t need to be air-conditioned. According to environmental engineer Ed Kettler, it’s one of the largest and most efficient raised floors in the world.

Ed Kettler: So instead of a typical computer room, which may have a 2- or 3-foot raised floor, this is a whole floor—floor to ceiling, about 15 feet. That allows us to use larger, more efficient fans that move a lot of air volume at a lower pressure, bring that air in, get it to go where we want it to go, and cool efficiently without using a lot of energy to do it.

Amy Coombs: This is the sound of the fans humming. Considering that the wall is lined with them, it’s not a lot of noise. Upstairs, things are just as quiet. The heat generated by the servers rises into grooves and vents in the ceiling, where fans guide the warmth outside. This cooling strategy has worked so well that the auxiliary coolers installed for backup have never been used—even during the warmest summer days. Kettler says this smart design makes the data center efficient.

Ed Kettler: We are on target for the 40 percent power savings we knew we would get, so we are more efficient than most buildings. You might want to think of it this way: A typical data center is 50 percent efficient. This data center is 83 percent efficient with power.

Amy Coombs: The data center is also painted with low-toxic paint. Rainwater is harvested from the rooftop for use in the bathrooms and gardens. And the lighting turns on one row at a time, tracking your progress as you walk down the hall, just as it does on Star Trek.

Amy Coombs: Back at Stanford, Joyce Dickerson says you can also make a difference for the environment with simple tactics that any data center can use. She recently installed scooped tiles that whisk cool air upwards towards the top of the server racks. 

Joyce Dickerson: We found these floor tiles that have little scoops on them that actually catch the air going under the floor and shoot it up to the top. And we found it dropped the top temperature of the rack by about 15 degrees, just by putting one of these passive tiles in place.

Amy Coombs: To demonstrate, Dickerson puts a piece of plastic on top of the scoop, and we watch the air blow it high off the ground. 

Joyce Dickerson: So you will see here, when I take a piece of plastic and put it on top of that one, it’s flying up about a quarter of the way up the rack. The volume of air is just a lot stronger.

Amy Coombs: Dickerson’s retrofit shows that a lot can be accomplished with very small changes. Buildings that contain data centers often can’t get green-building certifications because they use so much energy. Yet now, after its retrofit, Stanford is saving about [US] $300 000 a year in energy costs. That’s pretty cool savings for channeling out a lot of hot air.

I’m Amy Coombs.
