Bionic Arms Get a Thought-Control Upgrade

Pattern recognition software enables amputees to control prostheses in a natural and intuitive way

A prosthetic forearm and hand with a Coapt device
Photo: Matthew Stout/Sikich

Jodie O’Connell-Ponkos used a prosthetic arm for five years, until the day she threw it across the room in frustration. “Hate was an understatement,” recalls O’Connell-Ponkos, who lost part of her right arm in an industrial meat grinder at the age of 16 in 1985. She didn’t use another prosthesis for 20 years.

O’Connell-Ponkos’ story is common among upper-limb amputees: Despite advances in engineering and availability, the rate of users abandoning upper-limb prosthetics hadn’t changed in 25 years as of 2007, with up to 75% of users rejecting electric prosthetics.

One reason may be that despite better materials, more powerful motors, and additional joints, upper-limb prostheses have long relied on controls developed in the 1950s. These use either body-powered maneuvers involving clunky cables and harnesses or myoelectric systems, in which electronic sensors resting on the skin of the amputation site detect muscle activity and translate it into motion. The clenching of a bicep, for example, might bend an artificial elbow. Neither approach is intuitive, and both often require extensive practice.

Then, last year, O’Connell-Ponkos tried a prosthetic arm enhanced with a new control system that can recognize subtle nerve signals, built by Chicago-based engineering company Coapt. Unlike the prosthesis she used as a teenager, the new arm allowed her to move more naturally, even gracefully. Today, the outgoing horse trainer wears the prosthesis constantly, relying on it for everything from chopping wood to putting her hair in a ponytail.

This recent advance to a natural, intuitive control system for upper-limb prosthetics is notable, if largely overlooked. At this year’s American Orthotic and Prosthetic Association conference in Boston, I had to search for Coapt’s small booth, tucked away in the exhibit hall behind rows of splashy orthotics and leg prosthetics. There, O’Connell-Ponkos, now a paid spokesperson for Coapt, was promoting the technology, which is compatible with prostheses from the five major prosthetic manufacturers.

Coapt hit the market in late 2013, and an estimated 200 individuals today use the system, says company co-founder and CEO Blair Lock. The system, encased in a small black box, consists of a circuit board and set of algorithms that use pattern recognition to decode electric signals from arm muscles, working as a bridge between the user’s thoughts and the prosthesis.

Muscles act like loudspeakers to amplify nerve impulses—which are too quiet to be detected alone—and contain a “symphony” of information, says Lock. A traditional myoelectric system only detects the volume of the music, but pattern recognition software can link the pattern of a specific brain signal, like a particular song, to a movement.
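The distinction can be sketched in code. The snippet below is a minimal, illustrative sketch, not Coapt's actual algorithm: the channel layout, the mean-absolute-value feature, and the nearest-centroid classifier are my own simplifications, though they are standard building blocks in basic EMG pattern recognition. It captures the contrast the metaphor describes: amplitude-only control reads a single channel's "volume," while pattern recognition classifies the pattern of activity across several channels.

```python
def mav(window):
    """Mean absolute value: a classic time-domain EMG feature."""
    return sum(abs(s) for s in window) / len(window)

def amplitude_control(window, threshold=0.5):
    """Traditional myoelectric control: one channel, one action.
    Only the signal's 'volume' matters."""
    return "close_hand" if mav(window) > threshold else "rest"

def pattern_recognition_control(windows, centroids):
    """Compare the multi-channel feature vector against a stored
    per-movement template (a minimal nearest-centroid classifier)."""
    features = [mav(w) for w in windows]
    def dist(movement):
        return sum((f - c) ** 2 for f, c in zip(features, centroids[movement]))
    return min(centroids, key=dist)

# Toy templates for three pre-programmed movements, learned during a
# (hypothetical) calibration session. Note that "point" and "fist" could
# have similar total volume; it is the pattern across channels that
# tells them apart.
centroids = {
    "point": [0.9, 0.1, 0.2],
    "pinch": [0.2, 0.8, 0.3],
    "fist":  [0.7, 0.7, 0.7],
}

# Toy 3-channel signal windows whose feature vector resembles "point".
windows = [[0.85, -0.95], [0.1, -0.1], [0.25, -0.15]]
print(pattern_recognition_control(windows, centroids))  # -> point
```

A real system extracts several features per channel over sliding windows of raw EMG and uses a trained classifier, but the control flow is the same: features in, one of a small set of pre-programmed movements out.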

The company has plans to premiere a smaller, second-generation version soon, and recently licensed an implantable electrode technology from Purdue University to read electrical signals from under the skin, though Lock was tight-lipped about plans for the technology.

Coapt isn’t alone in changing the way upper-limb prostheses are controlled: Two other leading prosthetic efforts include advanced control systems. The Johns Hopkins Modular Prosthetic Limb (MPL) can also be operated with pattern recognition software, and DEKA Research’s “LUKE Arm,” named for Luke Skywalker's prosthesis in Star Wars, has used the Coapt system and also boasts a wireless foot control system. Both the MPL and LUKE Arm were funded by DARPA. Neither is yet commercially available, though the LUKE Arm is scheduled to launch late this year.

The Hopkins’ MPL pattern recognition system was developed in-house, says Mike McLaughlin, chief engineer for research and exploratory development at the Johns Hopkins Applied Physics Laboratory, which created the MPL. “The idea is that we’re able to translate thoughts into motion.”

The LUKE Arm can be controlled in several ways, including with the Coapt system, says Tom Doyon, part of the DEKA Research team that developed the LUKE Arm. Uniquely, the LUKE Arm can also be controlled with a wireless foot control that acts as a joystick to move the arm in preprogrammed patterns.

None of these prostheses, however, can be controlled like a natural hand. Even the best control systems execute a set of pre-programmed movements rather than offering total freedom. With the Coapt system, for example, an individual can pre-program about six to eight movements—such as pointing, pinching, or making a fist—for everyday use.

For now, the limiting factor isn’t the technology of the arm—the MPL, for example, has 26 joints and a couple hundred sensors—but the bandwidth required to decipher signals from the brain. “If you move your arm, there are probably 500 million neurons involved. Right now, the best we can do is see a few hundred of those neurons,” says McLaughlin. “We have all this stuff going on in our heads, and we have very limited capability of observing it.”

The future of prosthetic control hopes to tap into the brain’s symphony directly, by implanting electrodes under the skin or even directly into the brain. The Hopkins MPL team, in conjunction with the University of Pittsburgh, recently tested brain electrode implants in two patients with severe spinal cord injuries. Ideally, the technology will someday be non-invasive, says McLaughlin, “but we’re still not there yet. Give us another year or so.”

About the Human OS blog

IEEE Spectrum’s biomedical engineering blog, featuring the wearable sensors, big data analytics, and implanted devices that enable new ventures in personalized medicine.