JPL BioSleeve Enables Precise Robot Control Through Hand and Arm Gestures

This sensor-studded cuff from JPL enables intuitive robot control



No matter how capable you make a robot, its effectiveness is limited by how well you can control it. And until we've got this whole general autonomy thing nailed down (better not hold your breath), that means a lot of teleoperation. JPL has been working on a new gesture-based human interface called BioSleeve, which uses an array of EMG sensors, IMUs, and magnetometers to decode hand and arm gestures and map them to an intuitive robot control system.

BioSleeve is a sort of elastic bandage that covers most of your forearm and includes 16 dry-contact electromyography (EMG) sensors plus a pair of inertial measurement units. The sensors can detect movements of the muscles in your forearm, which is where most of the muscles that drive your hand actually live, meaning that the BioSleeve can tell when (and how much) you move your arm, wrist, hand, and individual fingers. This lets you make gestures and have a robot respond to them, much like existing gesture-recognition systems, except that since BioSleeve doesn't depend on vision or on having your hand close to a sensor, it's much easier to use for extended periods, in the field, and in cramped spaces like the ISS. Here's a demo:
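Under the hood, this kind of decoding is basically a pattern-recognition problem: take short windows of the EMG and IMU signals, turn each window into a feature vector, and classify it as one of the trained gestures. Here's a minimal sketch of such a pipeline; the 16 EMG channels and two IMUs match the article, but the sampling rate, feature set, and classifier (mean-absolute-value features plus a linear SVM) are assumptions for illustration, not JPL's actual implementation.

```python
# Minimal sketch of an EMG/IMU gesture-decoding pipeline (illustrative, not
# JPL's actual implementation): 16 EMG channels and 2 IMUs per the article;
# the window length, features, and classifier are assumed.
import numpy as np
from sklearn.svm import SVC

N_EMG_CHANNELS = 16      # dry-contact EMG electrodes in the sleeve
N_IMU_CHANNELS = 2 * 6   # two IMUs, 3-axis accel + 3-axis gyro each (assumed)
WINDOW_SAMPLES = 200     # ~200 ms window at an assumed 1 kHz sampling rate

def extract_features(window: np.ndarray) -> np.ndarray:
    """Collapse a (WINDOW_SAMPLES, channels) window into one feature vector.

    Mean absolute value per EMG channel captures contraction intensity;
    the mean of each IMU channel captures arm orientation/motion.
    """
    emg = window[:, :N_EMG_CHANNELS]
    imu = window[:, N_EMG_CHANNELS:]
    return np.concatenate([np.abs(emg).mean(axis=0), imu.mean(axis=0)])

def train_decoder(raw_windows, gesture_labels):
    """Fit a linear SVM on labeled training windows."""
    feats = np.array([extract_features(w) for w in raw_windows])
    clf = SVC(kernel="linear")
    clf.fit(feats, gesture_labels)
    return clf

def decode_gesture(clf, window: np.ndarray) -> int:
    """Return the predicted gesture id for one incoming sensor window."""
    return int(clf.predict(extract_features(window)[None, :])[0])
```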

In order to get the robot to go where the user points, it's assumed that the user's approximate shoulder position relative to the robot is known via other sensors. This shouldn't be a big deal, though, since shoulder position is generally easy to pick out with something like a Kinect.
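With the shoulder position known, a pointing gesture just becomes a ray: start at the shoulder, aim along the forearm (as estimated from the sleeve's IMUs), and see where it lands. Here's a toy version that intersects that ray with a flat ground plane to get a goal point for a rover; the coordinate-frame and flat-floor assumptions are mine, not the paper's.

```python
# Sketch of turning "point there" into a robot goal, assuming the shoulder
# position (e.g. from a Kinect) and the forearm direction are already known
# in the robot's coordinate frame. Flat ground is assumed for illustration.
import numpy as np

def pointing_goal(shoulder_pos: np.ndarray, forearm_dir: np.ndarray,
                  ground_z: float = 0.0) -> np.ndarray:
    """Intersect the ray from the shoulder along the forearm with the plane
    z = ground_z, and return the (x, y) goal point for the robot."""
    d = forearm_dir / np.linalg.norm(forearm_dir)
    if d[2] >= 0:  # pointing level or upward: no ground intersection
        raise ValueError("Pointing direction does not hit the ground plane")
    t = (ground_z - shoulder_pos[2]) / d[2]
    hit = shoulder_pos + t * d
    return hit[:2]

# Example: shoulder 1.4 m off the floor, pointing forward and slightly down.
print(pointing_goal(np.array([0.0, 0.0, 1.4]), np.array([1.0, 0.0, -0.5])))
```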

One big advantage of using EMG is that the signals are correlated with muscle force. This means that if clenching your fist signals a robot to drive forward, clenching harder will make the robot drive faster. And even with that variation in signal strength folded in, the BioSleeve can still accurately recognize a reasonably large number of gestures. With the full set of gestures (pictured below), the system can tell which is which with 96.6 percent accuracy. Using a smaller subset of 11 gestures, the accuracy is bumped up to 99.8 percent, or darn close to perfect. Personally, fewer gestures seem fine to me, especially since I don't seem to be physically capable of making the third one down in column one in the image below without cheating.
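That force-to-speed idea is easy to picture as a bit of code: normalize the EMG envelope, ignore anything below a resting threshold, and scale the rest linearly into a drive-speed command. The thresholds and speed limit below are made-up values for illustration, not anything from the paper.

```python
# Sketch of proportional control from EMG amplitude: a harder fist clench
# maps to a faster drive command. All constants here are illustrative.
def drive_speed_from_emg(emg_amplitude: float,
                         rest_level: float = 0.05,
                         max_level: float = 1.0,
                         max_speed_mps: float = 1.5) -> float:
    """Map a normalized EMG envelope (0..1) to a forward speed in m/s.

    Below rest_level the robot stops; between rest_level and max_level the
    speed scales linearly up to max_speed_mps.
    """
    if emg_amplitude <= rest_level:
        return 0.0
    frac = min((emg_amplitude - rest_level) / (max_level - rest_level), 1.0)
    return frac * max_speed_mps

# A light clench creeps forward; a hard clench approaches full speed.
print(drive_speed_from_emg(0.2))   # ~0.24 m/s
print(drive_speed_from_emg(0.9))   # ~1.34 m/s
```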

As-is, the BioSleeve relies on off-board computing to function, and it's a bit bulky to wear. The next generation should be more compact, lighter, and fully integrated with embedded computers and batteries. Specifically, the final version will offer the following advantages over existing systems:

  • Ease-of-use: The BioSleeve will be conveniently embedded into wearable garments, donned and doffed as part of daily clothes. No extra setup time is required for placement of individual electrodes, fine alignment, etc.
  • Free mobility: There are no external sensors, hand obstructions, or electrode wires imposing constraints on allowable movements.
  • Reliability: Large dense sensor arrays add redundancy and are more immune from movement artifacts (electrode slippage, etc.), with the potential to dramatically improve decoding reliability.
  • Durability: Active channel selection and low power consumption per channel enable operation for long periods of time on in-sleeve batteries.
  • Versatility: The output of the gesture recognition can be mapped into various command libraries for different robotic systems (see the sketch just after this list).
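That last point is essentially a lookup table: the recognizer produces a gesture label, and a per-robot map turns it into whatever command that platform understands. A minimal sketch, with made-up gesture names and command strings:

```python
# Sketch of the "versatility" point: the same recognized gesture can be
# remapped to different command libraries per robot. All names are invented.
GESTURE_MAPS = {
    "rover": {
        "fist": "drive_forward",
        "open_hand": "stop",
        "point": "go_to_pointed_location",
    },
    "manipulator": {
        "fist": "close_gripper",
        "open_hand": "open_gripper",
        "point": "move_end_effector_toward_target",
    },
}

def command_for(robot: str, gesture: str) -> str:
    """Look up the robot-specific command for a recognized gesture."""
    return GESTURE_MAPS[robot].get(gesture, "ignore")

print(command_for("rover", "fist"))        # drive_forward
print(command_for("manipulator", "fist"))  # close_gripper
```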

If you like the look of this thing but don't have a JPL-sized budget, check out the MYO from Thalmic Labs. It's basically the same underlying technology, except not as complex, and it'll only run you $150. In a strange Canadian coincidence, Thalmic makes its home right next to Clearpath Robotics, and the MYO demo video actually provided the very first (and very brief) look at the Grizzly, about a month before its official unveiling:

"Gesture-Based Robot Control with Variable Autonomy from the JPL BioSleeve," by Michael T. Wolf, Christopher Assad, Matthew T. Vernacchia, Joshua Fromm, and Henna L. Jethani from the Jet Propulsion Laboratory, California Institute of Technology, and Massachusetts Institute of Technology, was presented last week at IEEE International Conference on Robotics and Automation (ICRA) in Karlsruhe, Germany.
