In February, the Defense Advanced Research Projects Agency (DARPA) authorized the next phase of a four-year program to create prosthetic arms that better emulate natural limbs. The new arms will more closely match the real thing in appearance and function, and the wearer's ability to feel with them and control them will be a vast improvement over anything currently available. The Revolutionizing Prosthetics Program spans 30 organizations, including 10 universities across Canada, Europe, and the United States: the University of New Brunswick, Fredericton, is working on signal processing and pattern recognition for natural arm control; the University of Utah, in Salt Lake City, is working on electrodes for brain implants. The Johns Hopkins University Applied Physics Laboratory, in Laurel, Md., is "herding the cats," according to DARPA project manager Colonel Geoffrey Ling, ensuring that these far-flung research partners work together to make the bionic arm a near-term reality. Scientists involved say that this Manhattan Project-like system--on which DARPA has already spent US $30.4 million--is the only way to bring technology this advanced into the world by 2009.
The program was conceived in 2005 to create prosthetic arms that would leapfrog the stagnant hook-and-cable technology that has improved little since World War II. DARPA split the program into two separate projects--one of them a two-year effort to yield, by 2007, the most sophisticated mechanical arm possible with currently available technologies (that contract went to New Hampshire-based Deka Research and Development Corp.). The Applied Physics Laboratory, which leads the longer effort, also had a mandate to produce an arm with state-of-the-art mechanics by 2007. Called Proto-1, this first APL arm, completed in 2007, had approximately eight joints, or degrees of freedom.
APL's second-generation prototype, called Proto-2, begun in 2007, has 25 degrees of freedom--almost as much dexterity as a human limb. APL project leader Stuart Harshbarger says the Proto-2, with 15 motors in the hand alone, is capable of unprecedented mechanical agility and shows that a viable mechanical limb system--one replicating the finger movements of the native limb--can be developed over the next two years. But that's not really the point: making a truly bionic arm requires far more than mechanical breakthroughs, better processing power, or longer-lasting batteries. None of these enables the prosthetic to respond to the wearer's intent with a natural limb's unthinking grace.
"Think about taking a sip from a can of soda," Harshbarger says. The complex neural feedback system connecting a native limb to its user lets that user ignore an entire series of complicated steps. The nervous system makes constant automatic adjustments to ensure, for example, that the tilt of the wrist compensates for the changing fluid level inside the can. The action requires little to no attention. Not so for the wearer of current prosthetic arms, for whom the act of taking a sip of soda precludes any other activity. The wearer must first consciously direct the arm to extend to the correct point in space, then switch modes to rotate the wrist into position. Then he must open the hand, close it to grasp the soda can (not so weakly as to drop it but not so hard as to crush it), switch modes to bend the elbow to place the can correctly in front of his mouth, rotate the wrist into position, and then concentrate on drinking from the can of soda without spilling it.
All of today's prosthetics rely on the user to control them through a linear series of steps. The best prosthetic arm on the market today allows three degrees of freedom--moving the elbow, rotating the wrist, and opening and closing a claw--but you just can't do all three at once. The Deka arm, which should go to clinical trials this year, allows for simultaneous motion of several joints with 18 degrees of freedom. But without a direct neural interface, controlling even the most sophisticated arm still takes the attention and concentration required to control any machine. This is the fundamental difference between the intuitive grace of a native limb and the strained, hesitant movements of a prosthesis. Closing that gap requires sensory feedback--the loop that makes a real arm work like a real arm.
Sensory feedback for prosthetics is in its embryonic stage. The best mechanism on the market today is a vibrating motor that buzzes against the skin more or less intensely to reflect, for instance, grip strength. The DARPA project is gunning for much more than that: researchers want an arm that transmits real sensation to the user--pressure, texture, even temperature. The Proto-1 arm already has force sensors integrated into the artificial hand that give the wearer a rudimentary sense of touch. Harshbarger says Proto-2 builds on that breakthrough with 100 sensors that tie into the body's natural neural signaling to create a sensory feedback loop: as the wearer interacts with an object, the arm reports back, in real time, where it is in space, what it is touching, whether that object is smooth or rough, how hard the hand is holding it, and what temperature the object is. With that information, the user can react in a split second.
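The feedback loop described above can be caricatured in a few lines of code. This is a toy sketch under stated assumptions, not Proto-2's actual design: the sensor count, the 20-newton full-scale force, and the linear force-to-intensity mapping are all invented for illustration.

```python
# Toy sketch of a sensory feedback cycle: pressure sensors in the
# artificial hand are sampled and mapped to drive levels for tactors
# (small actuators pressing or vibrating against the residual limb).
# All constants here are illustrative assumptions.

MAX_FORCE_N = 20.0  # assumed full-scale fingertip force, in newtons

def tactor_intensity(force_n, max_force=MAX_FORCE_N):
    """Map a fingertip force to a tactor drive level between 0.0 and 1.0."""
    return max(0.0, min(1.0, force_n / max_force))

def feedback_frame(sensor_forces):
    """One real-time cycle: read every sensor, compute every tactor level."""
    return {finger: tactor_intensity(f) for finger, f in sensor_forces.items()}

frame = feedback_frame({"thumb": 5.0, "index": 22.0, "pinky": 0.0})
print(frame)  # {'thumb': 0.25, 'index': 1.0, 'pinky': 0.0}
```

A real system would run this cycle continuously and also encode texture and temperature, but the principle is the same: every sensed quantity is translated into a stimulus the wearer's nervous system can interpret.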
As it turns out, the degree of control is directly proportional to the invasiveness of the method. Harshbarger's team is working with four tiers of neural interface. Each tier adds an order of magnitude to the control and sensory capability of the prosthesis--and to the invasiveness of the surgery required.
For simple activities, like grasping a ball, you don't need surgery. The most basic interface (for low-level amputation) uses electrodes taped to the surface of the residual limb's skin. After all, the hand is missing, but not the muscles and nerves that once controlled it. The APL researchers figured out a way to tap the signals still being transmitted to the nonexistent hand from the residual muscles. They used the surface electrodes to detect and amplify those signals. Then, with complex signal-processing and pattern-recognition algorithms, the electrical impulses were translated into instructions for the arm's motors and microprocessors. But while the electrodes can amplify the signal, they can't clean it up: by the time that signal has traveled from the originating muscles through layers of flesh and skin, a lot of noise has been introduced, and some of the impulses may have crossed. So users can open and close the artificial hands at will, but they probably can't move individual fingers the way they want to.
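The surface-electrode pipeline just described--detect, amplify, then run pattern recognition to turn muscle signals into motor commands--can be sketched in miniature. The feature choice (mean absolute value per channel) and the nearest-centroid classifier below are simplifying assumptions for illustration; APL's actual signal-processing and pattern-recognition algorithms are far more sophisticated.

```python
import math

# Toy sketch of the surface-EMG pipeline: window the amplified signal,
# extract a simple amplitude feature per electrode channel, and match
# the feature pattern to the nearest known movement. The centroids
# stand in for patterns recorded while the user attempted each motion;
# they are hypothetical values, not real training data.

def mav_features(windows):
    """Mean absolute value of each channel's sample window."""
    return [sum(abs(s) for s in ch) / len(ch) for ch in windows]

CENTROIDS = {
    "open_hand":    [0.8, 0.1, 0.2],
    "close_hand":   [0.1, 0.9, 0.3],
    "rotate_wrist": [0.3, 0.2, 0.8],
}

def classify(features):
    """Pick the movement whose stored pattern is nearest the features."""
    def dist(c):
        return math.sqrt(sum((f - x) ** 2 for f, x in zip(features, c)))
    return min(CENTROIDS, key=lambda k: dist(CENTROIDS[k]))

# Three electrode channels, a few noisy samples each.
windows = [[0.7, -0.9, 0.8], [0.1, -0.1, 0.1], [0.2, -0.3, 0.2]]
print(classify(mav_features(windows)))  # open_hand
```

The sketch also shows why surface electrodes hit a ceiling: when noise and crosstalk blur the channels, distinct intentions produce overlapping feature patterns, and the classifier can no longer separate, say, one finger from another.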
To move individual fingers--necessary, for example, to hold a key or a pen steady--you need to access the muscle firings directly. The next level (of invasiveness and of control) bypasses those interfering layers of flesh and skin by using small wireless devices called injectable myoelectric sensors (IMES). These devices, each about the size of a grain of rice, are injected into the muscle tissue of the residual arm and work just like the surface electrodes, tapping the muscle signals right at the source. But because IMES pick up and transmit a cleaner, higher-fidelity signal, they allow finer motor control of the arm. "Instead of picking up the sum of the signals at the surface," says Harshbarger, "we can pick them up at the source, in the muscles that are being excited." That means, depending on the nature of the injury, a wearer could even control individual fingers. The little devices are perpetually powered by a coil in the prosthetic limb, so they never need batteries. At this point, Harshbarger says, nine have been implanted in trained primates for six months without harmful effects. "It's going incredibly," he says. "These are very low-risk devices, and they have posed no risk to the animals." But the IMES system depends on the nature of the injury and the availability of implantation sites--which is to say, if you don't have an arm with residual muscles to put IMES devices into, you're out of luck.
For more severe amputations (for example, both arms removed at the shoulder), there may not be enough residual arm--or muscle--for IMES or surface electrodes to work with. So the next level of interface bypasses the residual muscle to tap into the peripheral nerves, either with surgery or with implanted electrodes. So far the team has had great success with the former, a technique called targeted muscle reinnervation. Pioneered at one of APL's partners, the Rehabilitation Institute of Chicago (RIC), this surgery reroutes nerves that once led to the muscles controlling the native arm, opening a direct line between those nerves and the mechanical arm. To move a real arm, an intention sends a signal spiking down the nerve; that spike causes twitches in the terminal muscle, which in turn produce electric signals at the surface of the skin--the very signals electrodes can use to engage the features of a prosthetic hand and arm. In an individual with both limbs, those nerves travel from the spinal cord down the shoulder, over the clavicle, and into the armpit, where they connect to about 80 000 nerve fibers that allow the brain to communicate with the arm. When the arm is amputated at the shoulder, the residual nerves are still there, but the muscles they influence are not. So RIC's Todd Kuiken developed a surgical method to give those nerves new "home base" muscles--those still available in the chest. The surgery threads the nerves down under the clavicle, so that instead of extending to the armpit they now extend to the chest. After about six months, those nerves will spread into a saucer-size area of the chest muscle. That means that when the person tries to move his biceps, for example, a muscle in the chest will twitch in response.
Kuiken knew this would be the best chance at controlling the arm with intent, because to the surface electrodes a muscle twitch is a muscle twitch, whether it happens in the forearm or in the pectoral muscle. The result--the ability to move the artificial arm as if it were your own--is the same. At DARPATech, Jesse Sullivan (an electrician whose arms were amputated at the shoulder in 2001 after he was nearly electrocuted by a high-voltage wire) used the Proto-1 arm to gesture unconsciously while talking, shake a reporter's hand without consciously adjusting his grip strength, and eat small pieces of candy. The movements of the arm were totally--almost eerily--natural.
When Sullivan thinks about opening his hand or moving his elbow, the nerve spikes that would once have twitched the muscles of his hand and arm instead twitch his chest muscles. A surface electrode picks up those signals and translates them into mechanical motion. "When Jesse flexes his wrist, he's doing it by thinking about making that motion," says RIC's Kuiken. "The limb figures out the signals and what those patterns are, and then it causes the limb to move the way he intended. He's not thinking about some series of steps; he's thinking about moving his normal, regular arm."
Most notably, Sullivan was able to "feel" pressure applied to his index finger as if it were on the finger itself, not on some other part of his body where a vibrating motor might buzz. When the reporter squeezed the tip of the Proto-1 arm's thumb, Sullivan showed her where he could "feel" the pressure in his phantom limb: at the base of the thumb. Early last year, Kuiken discovered that in addition to allowing intuitive muscle control, the rerouted nerves unexpectedly sensitized the skin on the chest. But the volunteers didn't feel the sensation in the chest--they felt it in the phantom limb. Kuiken has been working to refine the map of the phantom limb, establishing the connection between an exact location on a prosthetic finger and the corresponding spot perceived in the phantom limb. He's hoping for a one-to-one match soon, so that when the tip of the thumb is squeezed, the feeling will be in the phantom thumb's tip. Kuiken says the patients were also able to feel hot and cold. Harshbarger says that some volunteers, using sensory-encoding tactors developed at Northwestern University for the Proto-2 arm, have been able to distinguish paper from sandpaper.
But what if these spare areas of muscle are damaged or otherwise unavailable? Another way to access the peripheral nerves is with penetrating electrodes--essentially needles--that intersect the nerves. Researchers at the University of Utah developed an implantable device called the Utah Slant Electrode Array (USEA), a 5-millimeter-square grid of 100 needlelike electrodes. The array integrates hundreds of mechanisms, among them signal amplifiers, storage registers, and a multiplexing scheme for transmitting to a receiver on the skin. Like the injectable sensors, these arrays can be powered wirelessly and extract a signal in real time. Unlike IMES, however, they tap the nerves directly instead of the muscles obeying the nerves--subtracting, in theory, another layer of signal interference. The electrode arrays are still experimental; Harshbarger says they will be used this year.
Finally, the most extreme solution is meant for people whose bodies no longer offer any means of interfacing to the artificial limb, for whom even nerve-rerouting surgery is not an option. In such cases, the Utah electrode arrays are relocated to the source of all neural signals--the brain's motor cortex, at the top of the head, toward the back of the frontal lobe. The arrays are either placed on the inside surface of the skull near the motor cortex or inserted directly into the cortex itself. A device much like the skull-mounted USEA has already been proven to pick up the brain's electrical signals and is currently used to warn epileptic patients of impending seizures. When the electrodes penetrate the motor cortex directly, embedded circuits intercept the motor neurons firing their instructions and, with the help of complex algorithms, translate those signals into a language that can control the mechanics of the arm.
It may take combinations of these methods to offer the most natural control and feedback, so Harshbarger wants to push the limits of all four tiers of interface. Rather than expecting the user to learn how to control the prosthetic limb, he wants electronics that figure out what the user wants the arm to do. "We don't want the user to have to learn a new strategy for activating muscles in order to control the limb," he says. So far, Sullivan has been able to put on a hat without looking in a mirror. Harshbarger says his next goal is for Sullivan to start typing. "Well," he qualifies, making hunting-and-pecking motions, "more like the way I type."