Game Developers See Promise in Cloud Computing, but Some Are Skeptical

AMD's Fusion Render Cloud supercomputer has prompted game developers to consider a new platform

3 March 2009—Amid all the noise at January’s Consumer Electronics Show, in Las Vegas, video-game developers heard the siren call of a new frontier. Advanced Micro Devices (AMD), the Sunnyvale, Calif.–based microprocessor manufacturer, and the Sherman Oaks, Calif.–based graphics software maker Otoy announced what they claimed will be the world’s fastest graphics supercomputer: the AMD Fusion Render Cloud.

AMD chief executive officer Dirk Meyer promised that the technology, which will begin beta testing by this summer, will “break the one-petaflop barrier and...process a million compute threads across more than 1000 graphics processors.” When running Otoy’s software, the payoff for gamers will be intense: high-definition, three-dimensional graphics rendered on a high-performance computer and then streamed online in real time. Otoy CEO Jules Urbach, whose company has provided effects for such films as Transformers, says the software will empower developers to create a “playable video game that has the quality of movies and runs on a Web page.”

But this won’t mean anything unless game developers believe in—and get behind—the innovation. So what do they think? While the notion of a real-time “holodeck” streaming to an iPhone sounds intriguing, developers are divided on the promise and perils of this new technology.

On one hand, the potential seems great. Corrinne Yu, the Seattle-based principal engine programmer and technical lead for Halo, Microsoft’s blockbuster sci-fi shooter, says, “The promise of fidelity to lighting and rendering is enormous,” noting that ordinarily onerous processes will now be feasible. For instance, the supercomputer could handle tasks such as radiance transfer, the real-time rendering of complex lighting. “Radiance transfer can then perform numerous expensive surface visibility operations that would have been too power consumptive for set-top and desktop power supplies,” she says.
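To give a sense of the kind of work Yu is describing: in one common form of the technique, precomputed radiance transfer, each surface point carries a vector of transfer coefficients computed offline (capturing how light is occluded and bounced around that point), and run-time shading reduces to combining that vector with the coefficients of the current environment lighting. The sketch below is a minimal illustration of that idea in Python; the vertex data and lighting coefficients are made-up placeholders, not anything drawn from AMD’s or Otoy’s systems.

    # Minimal sketch of precomputed radiance transfer (PRT) shading.
    # The transfer vectors and lighting coefficients are hypothetical
    # placeholders; a real engine would precompute them offline, per vertex.
    import numpy as np

    NUM_SH_COEFFS = 9  # three spherical-harmonic bands

    def shade_vertices(transfer, light_coeffs):
        # Radiance at each vertex is the dot product of its precomputed
        # transfer vector with the environment lighting's coefficients.
        return transfer @ light_coeffs

    rng = np.random.default_rng(0)
    transfer = rng.random((4, NUM_SH_COEFFS))   # computed offline (expensive)
    light_coeffs = rng.random(NUM_SH_COEFFS)    # updated every frame (cheap)

    print(shade_vertices(transfer, light_coeffs))  # per-vertex radiance

The expensive part, integrating visibility and interreflection into those transfer vectors, is exactly the kind of work a render cloud could take off a console’s hands.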

Mike Acton, engine director for Insomniac Games, the Burbank, Calif.–based developers of Ratchet & Clank and Resistance, agrees. “Server-side rendering is a hot topic because it has a certain amount of promise to it,” he says. “The promise is to render a lot more and render things better than you could do on an individual console or PC.”

David Lightbrown, senior artistic technical director for the Montreal-based developer A2M, says the technology could enable a bold new generation of mobile games in particular. “This allows iPhones and other thin-client devices to have really high-end graphics without having a big, expensive hot video card in them that draws battery life,” he says.

Despite the promise, though, some developers say that getting this dream out of the clouds is a daunting—if not insurmountable—challenge. “Unfortunately, there are many pitfalls to overcome,” says Andi Smithers, a 20-year industry veteran who was senior engineer of research and development for Sony Online Entertainment before becoming director of technical development for TC Digital Games, in San Diego. The pitfalls include the limits of the speed of light, he says, and “inconsistent bandwidth usage and predictability.”

Because responsiveness and action are so central to many games, developers are concerned that the lag between when the distant cloud computer renders a scene and when that scene shows up on a player’s screen will spoil cloud computing’s promise. “The real-time nature of games means that cloud processing will have too long a latency to help with the biggest bottleneck in real-time game graphics,” says Tobi Saulnier, CEO and founder of 1st Playable Productions, in Troy, N.Y.
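A rough sketch of the arithmetic behind that concern (the figures below are illustrative assumptions, not numbers from any of the companies involved): at 60 frames per second a game has about 16.7 milliseconds per frame, while the round trip to a distant data center, plus rendering, video encoding, and decoding, can easily add up to several times that.

    # Back-of-the-envelope latency budget for a cloud-rendered game frame.
    # Every figure here is a rough, illustrative assumption.
    FRAME_RATE_HZ = 60
    frame_budget_ms = 1000 / FRAME_RATE_HZ          # ~16.7 ms per frame

    distance_km = 2000      # hypothetical one-way distance to the data center
    fiber_km_per_ms = 200   # light in optical fiber, roughly two-thirds of c
    propagation_rtt_ms = 2 * distance_km / fiber_km_per_ms  # ~20 ms from physics alone

    routing_ms = 15         # assumed queuing and routing overhead
    render_ms = 10          # assumed server-side render time
    codec_ms = 15           # assumed video encode plus decode time

    total_ms = propagation_rtt_ms + routing_ms + render_ms + codec_ms
    print(f"frame budget: {frame_budget_ms:.1f} ms")
    print(f"input-to-photon delay: {total_ms:.1f} ms")
    print(f"lag in frames: {total_ms / frame_budget_ms:.1f}")

Even under these optimistic assumptions, a player’s input trails what appears on screen by several frames.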

Julien Merceron, worldwide chief technical director of the London-based Eidos, creators of Tomb Raider, says latency and limited bandwidth “will tend to severely limit the type of game that could benefit from the cloud and limit the resolution at which you can play the game.”

In the short term, however, turn-based and puzzle games could reap the rewards, developers say, because latency wouldn’t necessarily degrade the experience. Then again, gamers don’t have the same taste for eye candy that they once did. The most successful titles lately are those that emphasize unique play patterns over razzle-dazzle graphics—as is evident in hits such as Guitar Hero, Rock Band, and the Wii games. As Jason Della Rocca, executive director of the International Game Developers Association, puts it, “What consumers and players are generally looking for has nothing to do with fidelity of the graphics anymore.”

About the Author

David Kushner, an IEEE Spectrum contributing editor, is the author of Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture (Random House, 2003) and Jonny Magic and the Card Shark Kids: How a Gang of Geeks Beat the Odds and Stormed Las Vegas (Random House, 2005). He reported on the creation of the hit game Spore in the September 2008 issue.
