20 November 2008—Rehabilitation specialists have taken to Nintendo's Wii game console as a way to help motivate patients during physical therapy and rehabilitation. The latest addition to the Wii-hab phenomenon is perhaps its coolest: Air Guitar Hero. Researchers at Johns Hopkins University have turned the popular Guitar Hero game into a tool for amputees who are being fitted with the next generation of artificial arms. With a few electrodes and some very powerful algorithms, amputees can hit all the notes of Pat Benatar's "Hit Me With Your Best Shot" using only the electrical signals from their residual muscles.
The new research, which will be presented this Friday at the IEEE Biomedical Circuits and Systems Conference, in Baltimore, is one component of a program sponsored by the U.S. Defense Advanced Research Projects Agency (DARPA). The Revolutionizing Prosthetics (RP) 2009 project, spread over 30 research institutions worldwide and led by the Johns Hopkins University Applied Physics Laboratory (APL), in Laurel, Md., is developing a mechanical arm that closely mimics the properties of a real limb.
Besides developing two prototype mechanical arms, the project has also pioneered a nerve surgery for controlling the limbs. The nerves that once controlled an amputee’s arm are still intact even after the limb is lost. By rerouting these nerves into the chest muscles and affixing electrodes to pick up the electromyographic (EMG) signals, the RP 2009 researchers were able to use those signals to control a mechanical arm. As a result, the user feels as if he were controlling his own arm.
Though the surgery has worked so far to move an arm with six degrees of freedom, that arm still cannot enable individual finger movement—the ultimate goal of the project.
Dexterous motion of individual fingers poses a tricky software problem. To establish a clear link between mind and machine, the software that translates between EMG signals and the mechanical arm must be trained to understand what the different muscle signals mean. Pattern-recognition algorithms have to be trained by correlating input signal patterns (from muscle contractions) with the intended outputs (opening the mechanical index finger).
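To make the idea concrete, here is a minimal sketch of that kind of pattern-recognition training. Everything in it is illustrative, not the APL implementation: the gesture names, the four simulated EMG channels, and the nearest-centroid classifier are assumptions standing in for the far richer feature sets and algorithms a real prosthesis controller would use.

```python
import random
import statistics

random.seed(0)  # deterministic synthetic data for the demo

# Hypothetical setup: 3 intended finger movements, 4 EMG channels.
GESTURES = ["open_index", "close_middle", "close_ring"]

def synth_window(gesture, n=50):
    """Simulate one window of rectified EMG samples per channel.
    Each gesture biases a different channel, standing in for the
    distinct muscle-activation patterns the classifier must learn."""
    bias = GESTURES.index(gesture)
    return [[abs(random.gauss(1.0 if ch == bias else 0.2, 0.1)) for _ in range(n)]
            for ch in range(4)]

def features(window):
    # Mean absolute value per channel -- a classic, simple EMG feature.
    return [statistics.mean(ch) for ch in window]

def train(samples_per_gesture=20):
    # "Training" here is just averaging labeled feature vectors
    # into one centroid per intended movement.
    centroids = {}
    for g in GESTURES:
        feats = [features(synth_window(g)) for _ in range(samples_per_gesture)]
        centroids[g] = [statistics.mean(col) for col in zip(*feats)]
    return centroids

def classify(window, centroids):
    # Predict the nearest centroid in feature space.
    f = features(window)
    return min(centroids,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(f, centroids[g])))

centroids = train()
print(classify(synth_window("close_ring"), centroids))  # -> close_ring
```

The point of the sketch is the correlation step the article describes: labeled muscle-signal windows go in, and the algorithm learns which signal pattern corresponds to which intended finger movement.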
The amputee does this training with the help of what’s called a Virtual Integration Environment (VIE)—a virtual-reality training tool in which an onscreen animated arm mimics the user’s intended movements in real time, based on inputs from multiple electrodes attached to the user’s residual arm.
For the training, you sit in front of a screen and obey repeated commands to "close ring finger," "open index finger," "close middle finger," and so on, so that the machine can build an accurate map of your particular myoelectric control signals. The job of the algorithms and training programs is to map the amputee's subjective experience of flexing a ring finger to the corresponding movements of the virtual arm in the VIE.
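Structurally, that prompted calibration session is a simple loop. The sketch below is an assumption about its shape only: the command list and the `record_window` stub are placeholders for the VIE's actual prompts and multichannel electrode capture.

```python
import itertools

# Hypothetical prompts, echoing the commands described in the article.
COMMANDS = ["close ring finger", "open index finger", "close middle finger"]

def training_session(record_window, reps=3):
    """Cycle through the prompts, pairing each with one recorded EMG
    window -- the labeled examples the pattern recognizer trains on."""
    dataset = []
    for _, cmd in itertools.product(range(reps), COMMANDS):
        dataset.append((cmd, record_window(cmd)))
    return dataset

# Stand-in capture function: returns a flat, fake 4-channel window.
data = training_session(lambda cmd: [0.0] * 4)
print(len(data))  # 9 labeled windows (3 reps x 3 commands)
```

Every extra repetition adds another labeled example, which is why, as the next paragraphs explain, the sessions get so long and tedious.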
That's a lot harder than it might seem. The trouble comes when the myoelectric impulses of one finger need to be separated from the others. Press your thumb against your middle finger, and you'll see the problem. If you can't even easily actuate separate fingers with your native hand, how exactly is an algorithm supposed to figure out an amputee's intentions from muscles in the chest or upper arm? And because a movement might be slightly different every time you do it, it must be repeated many times during training for the control algorithm to latch onto the essential signal.
That calibration and training process is as tedious as it is discouraging. "By 3 p.m.," says APL engineer Robert Armiger, "[DARPA volunteer and double amputee] Jesse [Sullivan] has had lunch, and he's tired, just like you and I are tired after working all day. It's hard for him to keep his concentration up." Further motivation was needed.
The researchers had experimented with games before, notably a variation on Pong. You could move the Pong paddle by "opening and closing" your virtual hand, but that gross movement says nothing about controlling individual fingers. Though playing the game worked better than simply obeying repetitive commands, Armiger says it was useless for calibrating fine motor control. He also wanted a game whose metrics were more compatible with the real world. In the context of prosthetics, that means activating muscles to open and close "fingers" in real time, reacting quickly to a stimulus.
Inspired by Wii-hab, Armiger and colleague Jacob Vogelstein borrowed a colleague’s copy of Guitar Hero and attacked the controller with a soldering iron. They rewired the standard guitar-shaped controller to take instructions from the VIE.
Next they substituted muscle contractions for button presses, which meant rejiggering the inputs. Two-handed gamers normally play by using one hand to press five colored "fret" buttons while using the other to push a "strum" button in time with the notes. Onscreen, the same five colors scroll down the display in time with the notes the player is supposed to hit. To play a note correctly, the player must press the right fret button and, with the opposite hand, the strum button.
But Vogelstein and Armiger wanted to use the game to train an amputee. So first they needed to make the game's controls one-handed. They did that by wiring the two controls together so that an input from a muscle contraction would be read by the VIE as a simultaneous "fret" and "strum."
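In software terms, the rewiring collapses two button events into one. The sketch below is a hypothetical rendering of that mapping; the fret names are the game's standard colors, but the per-finger activation levels and the threshold are invented for illustration, not taken from the APL rig.

```python
# Standard Guitar Hero fret colors, one per "finger" channel.
FRETS = ["green", "red", "yellow", "blue", "orange"]

def emg_to_events(channel_levels, threshold=0.5):
    """Map per-finger EMG activation levels to simultaneous
    fret+strum button events, emulating the rewired controller."""
    events = []
    for fret, level in zip(FRETS, channel_levels):
        if level > threshold:
            # one contraction -> fret press AND strum, in the same tick
            events.append((fret, "strum"))
    return events

print(emg_to_events([0.1, 0.9, 0.2, 0.0, 0.7]))
# -> [('red', 'strum'), ('orange', 'strum')]
```

Because fret and strum fire together, a single detected contraction is enough to play a note, which is exactly what makes the game playable one-handed from muscle signals alone.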