HyQ Steps Across Gaps Despite Getting Yanked Around

IIT's quadruped has a new footstep planner that is robust against shoves and gaps

HyQ
Photo: IIT

If your robotics lab has a quadruped, it’s become almost a requirement that you post a video of the robot not falling over while walking across some kind of particularly challenging surface. And quadrupeds are getting quite good at keeping their feet, even while negotiating uneven terrain like steps or rubble. One way to do this is without any visual perception at all: the robot reacts to obstacles “blindly,” positioning its legs and feet to keep its body upright and moving in the right direction. This can work on terrain that’s continuous, but once you get into more dangerous situations, like gaps that a robot’s leg could get stuck in, using vision to plan a safe path becomes necessary.

Vision, though, is a real bag of worms, kettle of fish, bushel of geese, or whatever your own favorite tricky metaphor is. Adapting foot placement based on visual feedback takes both reliable sensing and the processing power to back it up, but even under the best of circumstances, there’s only so much that an onboard system can usually handle. At the Italian Institute of Technology, roboticists have used a convolutional neural network to reduce the time that it takes for the HyQ quadruped to plan its foot placement by several orders of magnitude, and it can now make dynamic adaptations, allowing it to withstand an extra helping of abuse from its human programmers.

When HyQ is being yanked around in the video above, it’s demonstrating that it can adjust where it places its feet, even after starting to take a step. Most robots plan their steps by saying, “I’m going to put my foot in that spot over there, ready, go!” This works just fine, except when something happens between the time the robot lifts its foot in one place and puts it down in another. HyQ’s new controller lets it replan almost continuously, adjusting on the fly whether it’s in the middle of a step or not, which makes it much more robust to external disturbances, whether those come from slippery surfaces, mistakes in foot placement, or shoves from human meanies.
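The continuous-replanning idea can be sketched in a few lines: recompute the target foothold on every control tick, even while the foot is in swing, so a disturbance discovered mid-step simply shifts the landing target. Everything below (the grid heightmap, the gap threshold, the nearest-safe-cell rule) is a simplified, hypothetical stand-in for HyQ’s actual controller, not IIT’s implementation.

```python
import numpy as np

def select_foothold(heightmap, nominal):
    # Pick the safe cell nearest the nominal target. Cells deeper than
    # -0.05 are treated as gaps. (A stand-in for the real foothold evaluation.)
    safe = np.argwhere(heightmap > -0.05)
    dists = np.linalg.norm(safe - nominal, axis=1)
    return tuple(safe[np.argmin(dists)])

def swing_with_replanning(heightmap, nominal, events, ticks=5):
    # Re-select the foothold on every control tick, so a change discovered
    # mid-swing (a shove, a slip, a newly seen gap) shifts the landing point.
    target = select_foothold(heightmap, nominal)
    for t in range(ticks):
        if t in events:              # a disturbance alters the perceived terrain
            events[t](heightmap)
        target = select_foothold(heightmap, nominal)  # replan, even mid-step
    return target

terrain = np.array(np.zeros((10, 10)))
nominal = np.array([5, 5])
# The foot initially aims at (5, 5); at tick 2 a gap opens under the target,
# and the replanning loop moves the landing point to a neighboring safe cell.
landing = swing_with_replanning(
    terrain, nominal, events={2: lambda h: h.__setitem__((5, 5), -1.0)})
```

A one-shot planner would have committed to (5, 5) before the gap appeared; replanning every tick is what lets the step absorb the disturbance.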

The rest of the video shows an example of a situation in which visual adaptation is critical to the health and happiness of the robot—gap crossing. Without visual feedback, gaps are potentially lethal to those skinny little robot legs. Rather than churn through an entire software stack devoted to interpreting sensor data and calculating optimal foot placement, HyQ uses a convolutional neural network, trained on a set of terrain templates including gaps, bars, rocks, and other nasty things, to interpret the 3D map of the area in front of it created by its onboard sensors. The network selects footsteps up to 200 times faster than traditional planning systems, which both enables the continuous replanning and opens up the option of more complex planning in the future, like specifying different gaits or body orientations to make the robot even more adaptable. And while it’s not in the video, the researchers tell us that HyQ can cross those gaps even while being yanked around.
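As a rough illustration of the framing (not IIT’s network), foothold selection over a heightmap can be posed as scoring each candidate cell by the local patch of terrain around it; the trained CNN replaces a hand-written score like the one below with a learned evaluation over the same kind of patches, which is where the speedup over iterative optimization comes from. The patch size and gap threshold here are invented for the sketch.

```python
import numpy as np

def foothold_scores(heightmap, patch=3, gap_depth=-0.05):
    # Score every cell by the roughness of the terrain patch centered on it.
    # Flat patches score low (good footing); any patch containing a gap is
    # heavily penalized. A CNN would learn this mapping from patch to score.
    h, w = heightmap.shape
    r = patch // 2
    scores = np.full((h, w), np.inf)     # border cells stay unusable
    for i in range(r, h - r):
        for j in range(r, w - r):
            local = heightmap[i - r:i + r + 1, j - r:j + r + 1]
            penalty = 1e3 if local.min() < gap_depth else 0.0
            scores[i, j] = local.var() + penalty
    return scores
```

Picking the next footstep is then just an argmin over the scored cells reachable from the nominal target, which is cheap enough to run on every control tick.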

Octavio Villarreal and Victor Barasuol, from the Dynamic Legged Systems lab at IIT, led by Claudio Semini, will be presenting this work at two IROS workshops on Friday: Development of Agile Robots, and Machine Learning in Robot Motion Planning. If you’re in Madrid, stop by and check it out, and if you’re not, ask yourself whether your commitment to robotics really could be just a bit more serious.

[ IIT ]
