Human Reflexes Help MIT’s HERMES Rescue Robot Keep Its Footing

MIT’s HERMES is a bipedal robot that uses full-body teleoperation to move with greater agility

Dynamic Duo: MIT’s João Ramos wears a teleoperation suit that connects his body to that of HERMES, a bipedal robot designed for disaster response. Ramos’s reflexes help HERMES keep its footing. Photo: Bob O’Connor

A sudden, tragic wake-up call: That’s how many roboticists view the Fukushima Daiichi nuclear disaster, caused by the massive earthquake and tsunami that struck Japan in 2011. Reports following the accident described how high levels of radiation foiled workers’ attempts to carry out urgent measures, such as operating pressure valves. It was the perfect mission for a robot, but none in Japan or elsewhere had the capabilities to pull it off. Fukushima forced many of us in the robotics community to realize that we needed to get our technology out of the lab and into the world.

Disaster-response robots have made significant progress since Fukushima. Research groups around the world have demonstrated unmanned ground vehicles that can drive over rubble, robotic snakes that can squeeze through narrow gaps, and drones that can map a site from above. Researchers are also building humanoid robots that can survey the damage and perform critical tasks such as accessing instrumentation panels or transporting first-aid equipment.

But despite these advances, building robots that match the motor and decision-making skills of emergency workers remains a challenge. Pushing open a heavy door, discharging a fire extinguisher, and other simple but arduous tasks require a level of coordination that robots have yet to master.

To test their teleoperation scheme, the MIT researchers had HERMES perform some tasks that required strength, like using a fire extinguisher. Gif: MIT/IEEE Spectrum

One way of compensating for this limitation is to use teleoperation—having a human operator remotely control the robot, either continuously or during specific tasks, to help it accomplish more than it could on its own.

Teleoperated robots have long been used in industrial, aerospace, and underwater settings. More recently, researchers have experimented with motion-capture systems to transfer a person’s movements to a humanoid robot in real time: You wave your arms and the robot mimics your gestures. For a fully immersive experience, special goggles can let the operator see what the robot sees through its cameras, and a haptic vest and gloves can provide tactile sensations to the operator’s body.

At MIT’s Biomimetic Robotics Lab, our group is pushing the melding of human and machine even further, in hopes of accelerating the development of practical disaster robots. With support from the Defense Advanced Research Projects Agency (DARPA), we are building a telerobotic system that has two parts: a humanoid capable of nimble, dynamic behaviors, and a new kind of two-way human-machine interface that sends your motions to the robot and the robot’s motions to you. So if the robot steps on debris and starts to lose its balance, the operator feels the same instability and instinctively reacts to avoid falling. We then capture that physical response and send it back to the robot, which helps it avoid falling, too. Through this human-robot link, the robot can harness the operator’s innate motor skills and split-second reflexes to keep its footing.

You could say we’re putting a human brain inside the machine.

Future disaster robots will ideally have a great deal of autonomy. Someday, we hope to be able to send a robot into a burning building to search for victims all on its own, or deploy a robot at a damaged industrial facility and have it locate which valve it needs to shut off. We’re nowhere near that level of capability. Hence the growing interest in teleoperation.

HERMES swinging an axe. Gif: MIT/IEEE Spectrum

The DARPA Robotics Challenge in the United States and Japan’s ImPACT Tough Robotics Challenge are among the recent efforts that have demonstrated the possibilities of teleoperation. One reason to have humans in the loop is the unpredictable nature of a disaster scene. Navigating these chaotic environments requires a high degree of adaptability that current artificial-intelligence algorithms can’t yet achieve.

For example, if an autonomous robot encounters a door handle but can’t find a match in its database of door handles, the mission fails. If the robot gets its arm stuck and doesn’t know how to free itself, the mission fails. Humans, on the other hand, can readily deal with such situations: We can adapt and learn on the fly, and we do that on a daily basis. We can identify variations in the shapes of objects, cope with poor visibility, and even figure out how to use a new tool on the spot.

The same goes for our motor skills. Consider running with a heavy backpack. You may run slower or not as far as you would without the extra weight, but you can still carry out the task. Our bodies can adapt to new dynamics with surprising ease.

The teleoperation system we are developing is not designed to replace the autonomous controllers that legged robots use to self-balance and perform other tasks. We’re still equipping our robots with as much autonomy as we can. But by coupling the robot to a human, we take advantage of the best of both worlds: robot endurance and strength in addition to human versatility and perception.

Our lab has long explored how biological systems can inspire the design of better machines. A particular limitation of existing robots is their inability to perform what we call power manipulation—strenuous feats like knocking a chunk of concrete out of the way or swinging an axe into a door. Most robots are designed for more delicate and precise motions and gentle contact.

We designed our humanoid robot, called HERMES (for Highly Efficient Robotic Mechanisms and Electromechanical System), specifically for this type of heavy manipulation. The robot is relatively light—weighing in at 45 kilograms—and yet strong and robust. Its body is about 90 percent of the size of an average human, which is big enough to allow it to naturally maneuver in human environments.

Instead of using regular DC motors, we built custom actuators to power HERMES’s joints, drawing on years of experience with our Cheetah platform, a quadruped robot capable of explosive motions such as sprinting and jumping. The actuators include brushless DC motors coupled to a planetary gearbox—so called because its three “planet” gears revolve around a “sun” gear—and they can generate a massive amount of torque for their weight. The robot’s shoulders and hips are actuated directly while its knees and elbows are driven by metal bars connected to the actuators. This makes HERMES less rigid than other humanoids, able to absorb mechanical shocks without its gears shattering to pieces.
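To get a feel for the trade-off such a geared actuator makes, here is a minimal sketch, with made-up numbers rather than HERMES’s actual specifications, of how a planetary gearbox multiplies a brushless motor’s torque while dividing its speed:

```python
# Illustrative sketch, not HERMES's actual specs: a planetary gearbox
# multiplies the motor's torque by the gear ratio (minus losses) while
# dividing its speed by the same ratio.

def actuator_output(motor_torque_nm, motor_speed_rpm, gear_ratio, efficiency=0.9):
    """Return (joint torque in N*m, joint speed in rpm) after the gearbox."""
    torque_out = motor_torque_nm * gear_ratio * efficiency
    speed_out = motor_speed_rpm / gear_ratio
    return torque_out, speed_out

# Made-up example: 2 N*m from the motor through a 6:1 stage gives
# roughly 10.8 N*m at the joint, at one sixth of the motor speed.
print(actuator_output(2.0, 3000.0, 6.0))
```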

The first time we powered HERMES on, it was still just a pair of legs. The robot couldn’t even stand on its own, so we suspended it from a harness. As a simple test, we programmed its left leg to kick. We grabbed the first thing we found lying around the lab—a plastic trash can—and placed it in front of the robot. It was satisfying to see HERMES kick the trash can across the room.

The human-machine interface we built for controlling HERMES is different from conventional ones in that it relies on the operator’s reflexes to improve the robot’s stability. We call it the balance-feedback interface, or BFI.

The BFI took months and multiple iterations to develop. The initial concept had some resemblance to that of the full-body virtual-reality suits featured in the 2018 Steven Spielberg movie Ready Player One. That design never left the drawing board. We discovered that physically tracking and moving a person’s body—with more than 200 bones and 600 muscles—isn’t a straightforward task, and so we decided to start with a simpler system.

HERMES also performed some tasks that required dexterity, like pouring water into a cup. Photo: MIT; Gif: MIT/IEEE Spectrum

To work with HERMES, the operator stands on a square platform, about 90 centimeters on a side. Load cells measure the forces on the platform’s surface, so we know where the operator’s feet are pushing down. A set of linkages attaches to the operator’s limbs and waist (the human body’s center of mass, basically) and uses rotary encoders to accurately measure displacements to within less than a centimeter. But some of the linkages aren’t just for sensing: They also have motors in them, to apply forces and torques to the operator’s torso. If you strap yourself to the BFI, those linkages can apply up to 80 newtons of force to your body, which is enough to give you a good shove.
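As a rough illustration of how a sensing platform like this can tell where the operator’s feet are pushing down, here is a minimal sketch that estimates the center of pressure from four corner load cells; the corner layout, forces, and variable names are assumptions for illustration, not the BFI’s actual firmware:

```python
import numpy as np

# Corner positions (meters) of a 0.9 m x 0.9 m platform, origin at its center.
# The four-corner load-cell layout is an assumption for illustration.
CORNERS = np.array([[-0.45, -0.45],
                    [ 0.45, -0.45],
                    [ 0.45,  0.45],
                    [-0.45,  0.45]])

def center_of_pressure(corner_forces_n):
    """Average the corner positions, weighted by the vertical force each
    load cell reads; returns the (x, y) center of pressure in meters."""
    forces = np.asarray(corner_forces_n, dtype=float)
    total = forces.sum()
    if total <= 0.0:
        return None  # no one is standing on the platform
    return (CORNERS * forces[:, None]).sum(axis=0) / total

# Example: loading the two cells on the +x edge more heavily shifts the
# center of pressure toward that edge.
print(center_of_pressure([100.0, 250.0, 250.0, 100.0]))
```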

We set up two separate computers for controlling HERMES and the BFI. Each computer runs its own control loop, but the two sides are constantly exchanging data. At the beginning of each loop, HERMES gathers data about its posture and compares it with data received from the BFI about the operator’s posture. Based on how the data differs, the robot adjusts its actuators and then immediately sends the new posture data to the BFI. The BFI then carries out a similar control loop to adjust the operator’s posture. This process repeats 1,000 times per second.
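The structure of that exchange can be sketched roughly as follows; the scalar “posture” values and the feedback gains are placeholders, not the real HERMES or BFI controllers, which run on dedicated hardware:

```python
LOOP_HZ = 1000            # each side repeats its loop 1,000 times per second
DT = 1.0 / LOOP_HZ

def robot_loop_step(robot_posture, operator_posture):
    """Robot side: compare postures, adjust the actuators toward the
    operator's posture, and report the new robot posture to the BFI."""
    error = operator_posture - robot_posture
    robot_posture += 0.1 * error             # placeholder actuator adjustment
    return robot_posture                      # sent back to the BFI

def bfi_loop_step(operator_posture, robot_posture):
    """BFI side: compare the operator's posture with the robot's and
    command the motorized linkages to nudge the operator accordingly."""
    error = robot_posture - operator_posture
    force_command_n = 5.0 * error             # placeholder feedback gain
    return force_command_n                    # applied to the operator's torso

# One simulated exchange: the robot trails the operator slightly, so the
# robot moves toward the operator and the BFI gently pushes the operator.
robot, operator = 0.0, 0.2
robot = robot_loop_step(robot, operator)
print(robot, bfi_loop_step(operator, robot))
```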

To enable the two sides to operate at such fast rates, we had to condense the information they share. For example, rather than sending a detailed representation of the operator’s posture, the BFI sends only the position of the person’s center of mass and the relative position of each hand and foot. The robot’s computer then scales these measurements proportionally to the dimensions of HERMES, which reproduces that reference posture. As in any other two-way teleoperation loop, this coupling may cause oscillations or instability. We minimized that by fine-tuning the scaling parameters that map the postures of the human and the robot.
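Here is a hedged sketch of that kind of proportional scaling; the rule of scaling every coordinate by a simple height ratio, and the numbers below, are assumptions for illustration:

```python
def scale_posture(operator_posture, operator_height_m, robot_height_m):
    """Scale the center of mass and relative hand/foot positions from
    human dimensions to robot dimensions by a simple height ratio."""
    ratio = robot_height_m / operator_height_m
    return {name: [coord * ratio for coord in point]
            for name, point in operator_posture.items()}

# Condensed posture the BFI might send: center of mass plus relative
# hand and foot positions, in meters. The values are made up.
operator = {
    "com":        [0.02,  0.00, 0.95],
    "left_hand":  [0.30,  0.25, 1.10],
    "right_foot": [0.05, -0.12, 0.00],
}

# HERMES is about 90 percent of human size, so the reference posture shrinks.
print(scale_posture(operator, operator_height_m=1.75, robot_height_m=1.58))
```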

To test the BFI, one of us (Ramos) volunteered to be the operator. After all, if you’ve designed the core parts of the system, you’re probably best equipped to debug it.

To the Rescue: Equipped with powerful motors, HERMES is a highly dynamic robot. Photo: Bob O’Connor

In one of the first experiments, we tested an early balancing algorithm for HERMES to see how human and robot would behave when coupled together. In the test, one of the researchers used a rubber mallet to hit HERMES on its upper body. With every strike, the BFI exerted a similar jolt on Ramos, who reflexively shifted his body to regain balance, causing the robot to also catch itself.

Up to this point, HERMES was still just a pair of legs and a torso, but we eventually completed the rest of its body. We built arms that use the same actuators as the legs, and hands made of 3D-printed parts reinforced with carbon fiber. The head features a stereo camera, for streaming video to a headset worn by the operator. We also added a hard hat, just because.

In another round of experiments, we had HERMES punch through drywall, swing an axe against a board, and, with oversight from the local fire department, put out a controlled blaze using a fire extinguisher. Disaster robots will need more than just brute force, though, so HERMES and Ramos also performed tasks that require more dexterity, like pouring water from a jug into a cup.

In each case, as the operator simulated performing the task while strapped to the BFI, we observed how well the robot mirrored those actions. We also looked at the scenarios in which the operator’s reactions could help the robot the most. When HERMES punched the drywall, for instance, its torso rebounded backward. Almost immediately, a corresponding force pushed the operator, who reflexively leaned forward, helping HERMES to adjust its posture.

We were ready for more tests, but we realized that HERMES is too big and powerful for many of the experiments we wanted to do. Although a human-scale machine allows you to carry out realistic tasks, it is also time-consuming to move, and it requires lots of safety precautions—it’s wielding an axe! Trying more dynamic behaviors, or even walking, proved difficult. We decided HERMES needed a little sibling.

Little HERMES is a scaled-down version of HERMES. Like its big brother, it uses custom high-torque actuators, which are mounted closer to the body rather than on the legs. This configuration allows the legs to swing much faster. For a more compact design, we reduced the number of axes of motion—or degrees of freedom, in robotic parlance—from six to three per limb, and we replaced the original two-toed feet with simple rubber spheres, each having a three-axis force sensor tucked inside.

Puppet Master: João Ramos and his colleagues are building a scaled-down version of HERMES called Little HERMES. The robot will feature autonomous functions but will also rely on a human teleoperator to accomplish more than it could on its own. Photo: Bob O’Connor

Connecting the BFI to Little HERMES required adjustments. There’s a big difference in scale between a human adult and this smaller robot, and when we tried to link their movements directly—mapping the position of the human’s knees and the robot’s knees, and so forth—it resulted in jerky motion. We needed a different mathematical model to mediate between the two systems. The model we came up with tracks parameters such as ground contact forces and the operator’s center of mass. It captures a sort of “outline” of the operator’s intended motion, which Little HERMES is able to execute.
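To convey the flavor of such a model, here is a rough sketch that normalizes the operator’s ground contact force by body weight and reapplies it at the robot’s scale, along with a scaled center-of-mass offset; the formulas and constants are illustrative assumptions, not the published controller:

```python
# Illustrative constants; not the published model or the robot's real numbers.
HUMAN_WEIGHT_N = 700.0    # roughly a 71 kg operator
ROBOT_WEIGHT_N = 120.0    # assumed Little HERMES weight
HUMAN_LEG_M = 0.90        # assumed leg lengths, used to scale CoM motion
ROBOT_LEG_M = 0.40

def map_to_robot(human_contact_force_n, human_com_offset_m):
    """Normalize the operator's ground contact force by body weight, then
    reapply it at the robot's scale; scale the CoM offset by leg length."""
    normalized_force = human_contact_force_n / HUMAN_WEIGHT_N
    robot_force_n = normalized_force * ROBOT_WEIGHT_N
    robot_com_offset_m = human_com_offset_m * (ROBOT_LEG_M / HUMAN_LEG_M)
    return robot_force_n, robot_com_offset_m

# Example: the operator pushes down with 1.2x body weight while leaning
# 5 cm forward; the robot targets the same relative effort at its own scale.
print(map_to_robot(840.0, 0.05))
```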

In one experiment, we had the operator step in place, slowly at first and then faster. We were happy to see Little HERMES marching in just the same way. When the operator jumped, Little HERMES jumped too.

In a sequence of photos we took, you can see both human and robot in midair for a brief instant. We also placed pieces of wood underneath the robot’s feet as obstacles, and its controller was able to keep it from falling.

Much of this was still preliminary work, and Little HERMES wasn’t freely standing or able to walk around. A supporting pole attached to its back prevented it from tipping forward. At some point, we’d like to develop the robot further and set it loose to amble around the lab and perhaps even outdoors, as we’ve done with Cheetah and Mini Cheetah (yes, it too has a little sibling).

Our next steps include addressing a host of challenges. One of them is the mental fatigue that an operator experiences after using the BFI for extended periods of time or for tasks that require a lot of concentration. Our experiments suggest that when you have to command not only your own body but also a machine’s, your brain tires quickly. The effect is especially pronounced for fine-manipulation tasks, such as pouring water into a cup. After repeating the experiment three times in a row, the operator had to take a break.

Our goal is to build an agile quadruped that transforms into a skilled bipedal robot.

The solution here is to have the operator and the machine share the responsibility for stabilizing the robot. If HERMES is performing a task that requires more conscious effort from the operator, the operator doesn’t also have to keep the robot balanced; an autonomous controller can take over the robot’s balance. One way to identify such scenarios is by tracking the operator’s gaze. A fixed stare indicates a mentally taxing task, and in such cases, the autonomous balancing mode should kick in.
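As a hedged sketch of how such gaze-based switching might look, the following checks whether the operator’s gaze has stayed nearly fixed for a while and, if so, hands balance over to the autonomous controller; the thresholds and the gaze-sample format are assumptions, not part of our current system:

```python
FIXATION_RADIUS_DEG = 2.0   # gaze counts as "fixed" within this cone
FIXATION_TIME_S = 1.5       # how long a fixed stare triggers the handoff

def should_autobalance(gaze_samples, sample_period_s=0.02):
    """gaze_samples: recent (azimuth, elevation) gaze angles in degrees,
    oldest first. Returns True when the gaze has barely moved for
    FIXATION_TIME_S, suggesting the operator is concentrating."""
    needed = int(FIXATION_TIME_S / sample_period_s)
    if len(gaze_samples) < needed:
        return False
    recent = gaze_samples[-needed:]
    ref_az, ref_el = recent[0]
    return all(abs(az - ref_az) <= FIXATION_RADIUS_DEG and
               abs(el - ref_el) <= FIXATION_RADIUS_DEG
               for az, el in recent)

# Example: a steady stare at roughly the same point for 1.5 seconds
# would switch the robot into autonomous balancing mode.
steady = [(10.0, -5.0)] * 80
print(should_autobalance(steady))   # True
```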

Another hurdle for our system—or any teleoperated system, for that matter—is transmission delays. Imagine you’re controlling a robot remotely and there’s a 1-second lag between your commands and the robot’s responses. You may still be able to teleoperate it, but if the delay gets any bigger, you may start to feel disoriented and unable to perform manipulations. Our plan is to rely on new wireless technologies like 5G, which offer both low latency and high throughput.

Finally, there are some bold new designs we’d like to explore. Although HERMES and Little HERMES are two-legged robots, there’s no real reason why a rescue robot should be bipedal. One promising possibility is a machine that would walk on four legs to traverse challenging terrain and then stand up on its hind limbs to perform manipulation tasks, much as some primate species do.

Our long-term vision is to merge the legged robots we’ve developed in our lab: Cheetah and HERMES. The result would be a fast-moving quadruped robot that could autonomously run into a disaster site, then transform into a bipedal robot that could borrow the skills and reflexes of an experienced first responder. We believe technologies like these will help emergency workers do their jobs better and more safely.

One day, hopefully soon, robots will be ready when called upon.

This article appears in the June 2019 print issue as “The Brain in the Machine.”

About the Authors

João Ramos is a postdoctoral associate at MIT’s Biomimetic Robotics Lab. He has a Ph.D. in mechanical engineering from MIT and B.S. and M.S. degrees, also in mechanical engineering, from the Pontifical Catholic University of Rio de Janeiro. In August, he’ll join the Department of Mechanical Science and Engineering at the University of Illinois at Urbana-Champaign as an assistant professor.

Albert Wang is a Ph.D. candidate at MIT’s Biomimetic Robotics Lab. He has a B.S. and M.S. in mechanical engineering from MIT.

Sangbae Kim is the director of MIT’s Biomimetic Robotics Lab and an associate professor of mechanical engineering at MIT. His research focuses on bio-inspired robot design, robotic actuators, and biomechanics. He has M.S. and Ph.D. degrees from Stanford University.
