20 September 2007—In an age of online shopping, video games, and banking, media and high-tech companies are struggling to keep up with soaring demand for information services. Big Internet firms such as Yahoo and Google have recently been on a data-center construction binge. But running server-crammed rooms without a glitch and keeping them from reaching boiling temperatures is not cheap. Powering and cooling data centers in the United States cost about US $4.5 billion in electricity in 2006, according to a report on server and data-center energy efficiency that the U.S. Environmental Protection Agency presented to Congress on 2 August.
Some computing heavyweights are already working on a solution. On 14 August, microprocessor maker Advanced Micro Devices convened a panel of IT and power supply vendors in Cambridge, Mass., in an attempt to nail down the hurdles to increasing data-center energy efficiency. Earlier this year AMD also initiated a consortium called the Green Grid to get the IT industry’s brains working together on reducing data-center power consumption. The Green Grid’s members include such big players as Intel—AMD’s archrival—and Hewlett-Packard, Dell, IBM, Microsoft, and Sun Microsystems.
Following the Green Grid’s logic, everyone wins by cutting energy use at data centers. Operators lower their electricity bills, and server makers (and their component suppliers) get to sell new, more-efficient hardware.
The Green Grid is trying to put the focus on improving the efficiency of existing data centers rather than those planned for the near future. Poorly managed, overcrowded centers are energy hogs: U.S. data centers consumed 61 billion kilowatt-hours in 2006, a figure expected to double by 2011 if the status quo is maintained, according to the EPA report. Some centers, AMD’s Brent Kerby says, “are being told by power suppliers that they can’t have any more power.”
Retooling the data center for large global organizations is expected to save at least $100 million over 10 years, says Kenneth Brill, founder and executive director of the Uptime Institute, an IT consulting firm in Santa Fe, N.M., that has been pushing for an industry-wide change in data-center power management for the past decade.
A few simple steps and existing technology—what Brill calls “low-hanging fruit”—can help data-center managers save money. Thirty percent of servers in a data center are comatose, he says, meaning they run at just 5 to 10 percent of their total load. Consolidating their applications onto fewer servers and switching the rest off is a no-cost way to cut electricity use. Data-center managers could also replace their hardware with energy-efficient servers and processors that are already on the market. Those two steps alone could cut electricity use by 25 percent, according to the EPA report.
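The consolidation arithmetic is easy to sketch. The 30 percent comatose share comes from Brill; the server count, per-server energy draw, and consolidation ratio below are invented for illustration only:

```python
# Back-of-envelope estimate of the savings from consolidating
# "comatose" servers. Only the 30 percent comatose share comes
# from the article; every other number here is hypothetical.

def consolidation_savings(num_servers, kwh_per_server_year,
                          comatose_share=0.30, consolidation_ratio=5):
    """Estimate annual kWh saved by packing the workloads of
    under-utilized servers onto fewer machines and powering
    off the rest."""
    comatose = num_servers * comatose_share
    survivors = comatose / consolidation_ratio   # hosts kept to carry the load
    powered_off = comatose - survivors
    return powered_off * kwh_per_server_year

# A hypothetical 1,000-server room drawing ~4,000 kWh per server a year:
saved = consolidation_savings(1000, 4000)
print(f"{saved:,.0f} kWh saved per year")  # 960,000 kWh saved per year
```

Even at a modest 5-to-1 consolidation ratio, powering off most of the comatose machines eliminates nearly a quarter of the room’s consumption, which is consistent with the EPA’s 25 percent figure.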
A little more effort could mean up to 55 percent less power use, the report says. That would include, for instance, consolidating all servers, not just comatose ones. Companies such as VMware Inc., of Palo Alto, Calif., offer what is called “virtualization software” to intelligently consolidate applications without affecting performance. Managers could also improve the arrangement of servers and increase air flow in data centers, break down cooling by different areas of server rooms, and adopt state-of-the-art water cooling and energy-efficient lighting systems. “Fifty percent improvements are there already,” Brill says. “It’s just a matter of doing it.”
And that’s where the problem lies. At the meeting in Cambridge, executives from several companies complained that they are making energy-efficient equipment but that data-center managers are not adopting it. That is, in part, because of a gap between IT professionals and the people who keep the accounts. “IT people running data centers don’t pay the energy bill and oftentimes don’t even see it,” says Joe Loper, vice president of policy and research at the nonprofit Alliance to Save Energy, in Washington, D.C.
Another barrier, Loper says, is that IT managers are hesitant to try anything too new, especially given the criticality of data centers. “If they shut down you’re going to be in big trouble, and so there’s a reluctance to do anything innovative or to make changes that aren’t going to just increase goals to provide IT service.”
But industry efforts such as the Green Grid are a sign that things are changing, says Jonathan Koomey, an Uptime Institute Fellow. Indeed, AMD’s Kerby says one of the Green Grid’s main goals is to raise energy awareness among data-center operators. The Uptime Institute, on the other hand, is pushing to get high-level executives to pay attention.
The government and utility companies are also starting to get involved. The California utility Pacific Gas & Electric Co., in San Francisco, began offering incentives to data centers to consolidate servers late last year. And the EPA’s report is a step toward setting efficiency benchmarks for data-center equipment, similar to the EPA’s Energy Star labels for computers, printers, and other products. Being certified under such a benchmark might be attractive to a data center’s potential customers.
Efficiency benchmarks and labels are crucial, Loper says. The biggest challenge in increasing data-center energy efficiency right now is that operators cannot easily measure how efficient their centers are or how they stack up against others. Data centers consume energy differently depending on the type of servers they run and what their servers are doing, so it is difficult to compare them head to head.
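One yardstick the Green Grid has promoted for exactly this comparison problem is power usage effectiveness (PUE): the ratio of a facility’s total power draw to the power that actually reaches the IT equipment. A minimal sketch of the calculation follows; the facility numbers are invented for illustration:

```python
# Sketch of power usage effectiveness (PUE), the ratio of total
# facility power to the power delivered to IT equipment. A PUE of
# 1.0 would mean every watt goes to computing; anything above that
# is overhead (cooling, power conversion, lighting). The numbers
# below are hypothetical.

def pue(total_facility_kw, it_equipment_kw):
    """Return the facility's power usage effectiveness."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A hypothetical center: 1,200 kW at the meter, 600 kW of it at the servers.
print(f"PUE = {pue(1200, 600):.2f}")  # PUE = 2.00
```

Because PUE normalizes away what the servers are computing and measures only how much overhead the building adds, it lets two very different centers be compared head to head.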
Loper expects to see an Energy Star-like label within the next two years. Strict standards might take much longer, he says. But right now, a little bit of healthy competition to earn the EPA’s gold star might be just what data centers need.
About the Author
Prachi Patel-Predd is a writer and radio reporter based in Pittsburgh.