How to Stuff Five Universities Into One Computer Center

A multi-institutional Massachusetts computer center tests out terascale computing—and the social engineering needed to use it


24 May 2012—When the Massachusetts Green High-Performance Computing Center launches at the end of this year, its energy efficiency and low carbon emissions (as IEEE Spectrum has detailed) may garner some headlines. But MGHPCC is a trailblazer in one other, perhaps more significant, way.

It puts to a high-stakes test what might be called the “Thanksgiving dinner” approach to academic computing: Put multiple outspoken and diversely opinionated entities under one roof, cross your fingers, and work to ensure that they all get along.

Each of MGHPCC’s five member institutions at the Thanksgiving table—Boston University, Harvard University, MIT, Northeastern University, and the University of Massachusetts system—will by year’s end begin transitioning research computing over to the center. MGHPCC, based in Holyoke, a once-thriving Massachusetts mill town two hours west of Boston, does not provide its own computers but rather the power, networking, and cooling infrastructure needed to remotely and inexpensively host members’ computers.

The US $95 million center’s executive director, John Goodhue, says teraflop-scale computing is blending into the background of a competitive university’s physical plant. Top research universities, he says, are increasingly obligated to make that number-crunching power as simple and seamless to access as possible. So MGHPCC provides what Goodhue calls “ping, power, and pipe” for its members’ terascale computing needs.

“Getting high-performance computing cycles should not be any harder than getting 110 volts out of a wall socket,” Goodhue says of his center’s ultimate mission.

Getting there, of course, is the trick.

MGHPCC’s chief competitor is inertia. Many of the site’s future users, Goodhue says, will be moving their hardware from often-cramped and improvised on-campus locations—from server racks in the storage closet and dedicated labs (whose floor space could be put to better use).

“The nice thing about the computer cluster in the closet in your lab is that you have complete control,” he says. “When you need it, you get it. When you don’t need it, you know where it is.

“So locating the physical hardware [at the center] but making it as if you still have your private machine is the challenge for us. And the technologies that make that possible are now here: high bandwidth, inexpensive networking...and virtualization, building an environment that is totally private to the end user.”

In short, says Francine Berman, professor of computer science at Rensselaer Polytechnic Institute, MGHPCC is not trying to be a skyscraper but rather a city. And as with any city, getting every aspect of the infrastructure right is a crucial challenge. But, adds Berman, “the first and perhaps biggest challenge is social engineering. As hard as the technical, computational, power, and software problems are, social engineering of the stakeholders is dramatically difficult.”

For instance, she says, different kinds of users and users from different institutions have sometimes conflicting standards of success. The tension at the table, as any veteran of contentious family meals knows, could simmer quietly or boil over.

Coalitions of ultracompetitive member research groups might only find MGHPCC worth supporting over the long run if it can contribute to a portfolio of highly cited papers and breakthrough discoveries. But such a high-powered course could also conflict with educators’ missions: diversifying the center’s user base by wooing arts and humanities faculty to try terascale computing for their research, for instance, or reaching out to students who might otherwise never program or use a supercomputer. “To get a broad user community is really challenging and often does not lead to the big bragging rights that the stakeholders need,” says Berman.

Having been director of the San Diego Supercomputer Center from 2001 to 2009, Berman is in a position to judge MGHPCC’s specs, and to her they look promising. Ultimately, MGHPCC and its cohorts perform more than just a technical duty, Berman says. The facility is the training and testing ground for tomorrow’s mainstream computing applications.

The massive data centers powering today’s familiar online brands—including Amazon, Apple, eBay, Facebook, and Google—are applying knowledge and expertise first acquired in the 1990s and early 2000s at cutting-edge academic computing centers, the forerunners of facilities like MGHPCC.

By 2020, Berman says, high-performance computing will be well into the exascale. And the terascale and petascale innovations developed today in academic and other research computer centers will, like the students currently developing them, move out into the real world. “Many of the applications that start in high-performance computing now, you’ll be seeing on your smartphone at the end of the decade,” she says. “These are important things for us to be doing, and if we do them right, they really do migrate to a much broader place.”

About the Author

Mark Anderson is an IEEE Spectrum contributing editor based in Northampton, Mass. He is the author of The Day the World Discovered the Sun, which was the subject of an interview on Spectrum’s “Techwise Conversations” podcast.

 
