Robots With Their Heads in the Clouds

A Google researcher argues that cloud computing could make robots smaller, cheaper, and smarter

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.


Photo: RoboEarth
Sharing Smarts: A robot, part of the RoboEarth project, taps the cloud to learn how to serve a drink to a patient.

In one of the many famous scenes in The Matrix (1999), the character Trinity learns to fly a helicopter by having a "pilot program" downloaded to her brain.

For us humans, with our offline, nonupgradable meat brains, the possibility of acquiring new skills by connecting our heads to a computer network is still science fiction. Not so for robots.

Several research groups are exploring the idea of robots that rely on cloud-computing infrastructure to access vast amounts of processing power and data. This approach, which some are calling "cloud robotics," would allow robots to offload compute-intensive tasks like image processing and voice recognition, and even to download new skills instantly, Matrix-style.

Imagine a robot that finds an object that it's never seen or used before—say, a box of cornflakes. The robot could simply send an image of the box to the cloud and receive the object's name, a 3-D model, nutritional information, and instructions on how to pour it.
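To make that exchange concrete, here is a minimal sketch, in Python, of the kind of query such a robot might send. The service URL, request fields, and response keys are assumptions made for illustration; none of the projects listed below expose exactly this interface.

    # Minimal sketch of a cloud object-recognition query. The endpoint and
    # the response fields are hypothetical, not the API of any real project.
    import requests

    def identify_object(image_path):
        with open(image_path, "rb") as f:
            reply = requests.post(
                "https://example-robot-cloud.org/recognize",  # hypothetical service
                files={"image": f},
                timeout=5.0,
            )
        reply.raise_for_status()
        # Hypothetical reply, e.g.:
        # {"name": "cornflakes box", "model_url": "...", "handling": ["pour", ...]}
        return reply.json()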

Cloud Robotics Projects

• RoboEarth is a European project led by the Eindhoven University of Technology, in the Netherlands, to develop a "World Wide Web for robots," a giant database where robots can share information about objects, environments, and tasks.

• Researchers at Singapore's ASORO (A-Star Social Robotics Laboratory) have built a cloud-computing infrastructure that allows robots to generate 3-D maps of their environments much faster than they could with their onboard computers.

• Google engineers developed Android-powered robot software that allows a smartphone to control robots based on platforms like Lego Mindstorms, iRobot Create, and Vex Pro.
Photo: Cellbots

• Researchers at the Laboratory of Analysis and Architecture of Systems, in Toulouse, France, are creating "user manual" repositories for everyday objects to help robots with manipulation tasks.

• At a children's hospital in Italy, Nao humanoid robots, created by the French firm Aldebaran Robotics, will rely on a cloud infrastructure to perform speech recognition, face detection, and other tasks that might help improve their interaction with patients.
Photo: Aldebaran Robotics

For conventional robots, every task—moving a foot, grasping things, recognizing a face—requires a significant amount of processing and preprogrammed information. As a result, sophisticated systems such as humanoid robots need to carry powerful computers and large batteries to power them.

James Kuffner, a professor at Carnegie Mellon University, currently working at Google, described the possibilities of cloud robotics at the IEEE International Conference on Humanoid Robots, in Nashville, this past December. Embracing the cloud could make robots "lighter, cheaper, and smarter," he told the assembled engineers.

According to Kuffner, cloud-enabled robots could offload CPU-heavy tasks to remote servers, relying on smaller and less power-hungry onboard computers. Even more promising, the robots could turn to cloud-based services to improve such capabilities as recognizing people and objects, navigating environments, and operating tools.
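As a rough illustration of that division of labor, the Python sketch below keeps a fast motion loop onboard and ships compressed camera frames to a remote perception service in the background. The camera and base objects and the cloud endpoint are assumptions made for the example, not part of any system Kuffner described.

    # Sketch: onboard control stays fast and local; heavy perception goes to
    # the cloud asynchronously. Endpoint and robot APIs are hypothetical.
    import concurrent.futures
    import requests

    CLOUD_DETECT_URL = "https://example-robot-cloud.org/detect"  # hypothetical
    executor = concurrent.futures.ThreadPoolExecutor(max_workers=2)

    def detect_in_cloud(jpeg_bytes):
        reply = requests.post(CLOUD_DETECT_URL, files={"frame": jpeg_bytes}, timeout=2.0)
        reply.raise_for_status()
        return reply.json()["objects"]

    def control_loop(camera, base):
        pending, objects = None, []
        while True:
            frame = camera.grab_jpeg()            # hypothetical onboard camera call
            if pending is not None and pending.done():
                objects = pending.result()        # latest labels back from the cloud
                pending = None
            if pending is None:
                pending = executor.submit(detect_in_cloud, frame)
            base.step(objects)                    # quick, local motion update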

The idea of connecting a robot to an external computer is not new. Back in the 1990s, University of Tokyo researchers explored the concept of a "remote brain," physically separating sensors and motors from high-level "reasoning" software. But the amount of computing power a cloud-connected robot has access to is far greater now than what the researchers imagined during the Web's early days.

Kuffner, who is a member of Google's autonomous car project, is now exploring a variety of cloud robotics ideas, including "using small mobile devices as Net-enabled brains for robots," he told IEEE Spectrum. Some of his colleagues recently unveiled Android-powered robot software and a small mobile robot dubbed the Cellbot. The software allows an Android phone to control robots based on platforms like Lego Mindstorms, iRobot Create, and Vex Pro.

But cloud robotics isn't limited to smartphone robots. It could apply to any kind of robot, large or small, humanoid or not. Eventually, some of these robots could become more standardized, and sharing applications would be easier. Then, Kuffner suggested, something even more interesting could emerge: an app store for robots.

The app paradigm is one of the crucial factors behind the success of smartphones. What could apps do for robotics? It's too early to say. But at the Nashville gathering, roboticists received Kuffner's idea with enthusiasm.

"The next generation of robots needs to understand not only the environment they are in but also what objects exist and how to operate them," says Kazuhito Yokoi, head of the Humanoid Research Group at Japan's National Institute of Advanced Industrial Science and Technology. "Cloud robotics could make that possible by expanding a robot's knowledge beyond its physical body."

"Coupling robotics and distributed computing could bring about big changes in robot autonomy," says Jean-Paul Laumond, director of research at France's Laboratory of Analysis and Architecture of Systems, in Toulouse. He's not surprised to see Google, which develops core cloud technologies and services, pushing the idea of cloud robotics.

But Laumond and others note that the cloud is not the solution to all of robotics' difficulties. In particular, controlling a robot's motion—which relies heavily on sensors and feedback—won't benefit much from the cloud. "Tasks that involve real-time execution require onboard processing," he says.

And there are other challenges. As any Net user knows, cloud-based applications can get slow or simply become unavailable. If a robot relies too much on the cloud, a hitch in the network could leave it "brainless."
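A common way to guard against that failure mode is a timeout-and-fallback pattern: try the cloud, and if it is slow or unreachable, fall back to a simpler onboard routine. The sketch below is only an illustration under that assumption, with a hypothetical local_recognizer standing in for whatever limited processing the robot can still do by itself.

    # Sketch: prefer the cloud, but never let a network hitch stall the robot.
    import requests

    def recognize(image_bytes, local_recognizer):
        try:
            reply = requests.post(
                "https://example-robot-cloud.org/recognize",  # hypothetical
                files={"image": image_bytes},
                timeout=1.0,  # bound the wait so control code keeps running
            )
            reply.raise_for_status()
            return reply.json()
        except requests.RequestException:
            # Cloud slow or unreachable: degrade gracefully to onboard processing.
            return local_recognizer(image_bytes)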

Still, Kuffner is optimistic. He envisions a future when robots will feed data into a "knowledge database," where they'll share their interactions with the world and learn about new objects, places, and behaviors. Maybe they'll even be able to download a helicopter-pilot program.

To Probe Further

A version of this article appeared in IEEE Spectrum's Automaton blog in January.
