
Bionic Arms Get a Thought-Control Upgrade

Pattern recognition software enables amputees to control prostheses in a natural and intuitive way

A prosthetic forearm and hand with a Coapt device
Photo: Matthew Stout/Sikich

Jodie O’Connell-Ponkos used a prosthetic arm for five years, until the day she threw it across the room in frustration. “Hate was an understatement,” recalls O’Connell-Ponkos, who lost part of her right arm in an industrial meat grinder at the age of 16 in 1985. She didn’t use another prosthesis for 20 years.

O’Connell-Ponkos’ story is common among upper-limb amputees: Despite advances in engineering and availability, the rate at which users abandon upper-limb prostheses had not changed in 25 years as of 2007, with up to 75 percent of users rejecting electric prostheses.

One reason may be that, despite better materials, more powerful motors, and additional joints, upper-limb prostheses still relied on controls developed in the 1950s. These used either body-powered maneuvers involving clunky cables and harnesses, or myoelectric systems, which use electronic sensors resting on the skin of the amputation site to detect muscle activity and translate that activity into motion. The clenching of a bicep, for example, might bend an artificial elbow. It wasn’t intuitive, and it often required extensive practice.
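The conventional myoelectric approach described above can be sketched in a few lines. This is an illustration of the general idea only, not any manufacturer's actual firmware: one channel's signal amplitude is compared against a threshold, and contraction above the threshold drives the joint.

```python
# Sketch of conventional single-channel myoelectric control (illustrative
# only). Contraction intensity above a threshold produces motion;
# anything below it is ignored.

def myoelectric_command(emg_samples, threshold=0.3):
    """Return a motor command (0.0-1.0) from one window of EMG samples."""
    # Mean absolute value approximates contraction intensity ("volume").
    intensity = sum(abs(s) for s in emg_samples) / len(emg_samples)
    if intensity < threshold:
        return 0.0  # below threshold: no motion
    # Proportional control: stronger contraction -> faster joint motion.
    return min((intensity - threshold) / (1.0 - threshold), 1.0)

# A strong bicep clench (high amplitude) bends the elbow quickly...
print(myoelectric_command([0.8, -0.9, 0.85, -0.7]))     # large command
# ...while resting muscle produces no motion.
print(myoelectric_command([0.05, -0.04, 0.06, -0.05]))  # 0.0
```

Note that a scheme like this reads only overall intensity from a single site, which is why a single muscle ends up mapped to a single joint.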

Then, last year, O’Connell-Ponkos tried a prosthetic arm enhanced with a new control system that can recognize subtle nerve signals, built by Chicago-based engineering company Coapt. Unlike the prosthesis she used as a teenager, the new arm allowed her to move more naturally, even gracefully. Today, the outgoing horse trainer wears the prosthesis constantly, relying on it for everything from chopping wood to putting her hair in a ponytail.

This recent advance to a natural, intuitive control system for upper-limb prosthetics is notable, if largely overlooked. At this year’s American Orthotic and Prosthetic Association conference in Boston, I had to search for Coapt’s small booth, tucked away in the exhibit hall behind rows of splashy orthotics and leg prosthetics. There, O’Connell-Ponkos, now a paid spokesperson for Coapt, was promoting the technology, which is compatible with the five major prosthetic manufacturers.

Coapt hit the market in late 2013, and an estimated 200 individuals today use the system, says company co-founder and CEO Blair Lock. The system, encased in a small black box, consists of a circuit board and a set of algorithms that use pattern recognition to decode electric signals from arm muscles, working as a bridge between the user’s thoughts and the prosthesis.

The next-generation Coapt device (mockup, in orange) will be significantly smaller and thinner than the original (in black). Photo: Megan Scudellari

Muscles act like loudspeakers to amplify nerve impulses—which are too quiet to be detected alone—and contain a “symphony” of information, says Lock. A traditional myoelectric system only detects the volume of the music, but pattern recognition software can link the pattern of a specific brain signal, like a particular song, to a movement.
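Lock's song metaphor maps onto a standard pattern-recognition pipeline: extract features from several electrode channels, then match the resulting pattern to a learned movement class. The sketch below is my own minimal illustration of that idea (a nearest-centroid classifier over toy data), not Coapt's proprietary algorithm.

```python
# Minimal sketch of EMG pattern recognition (illustrative, not Coapt's
# actual algorithm). Instead of reading one channel's "volume," a
# classifier looks at the pattern of activity across electrodes and maps
# it to a learned movement.

def features(window):
    """Mean absolute value per channel: a simple, standard EMG feature."""
    return [sum(abs(s) for s in ch) / len(ch) for ch in window]

def train_centroids(labeled_windows):
    """Average the feature vectors recorded for each movement class."""
    sums, counts = {}, {}
    for label, window in labeled_windows:
        f = features(window)
        acc = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def classify(window, centroids):
    """Pick the movement whose learned pattern is nearest to this window."""
    f = features(window)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

# Two-channel toy data: "fist" activates channel 0, "point" channel 1.
training = [
    ("fist",  [[0.9, -0.8, 0.85], [0.1, -0.1, 0.05]]),
    ("point", [[0.1, -0.05, 0.1], [0.8, -0.9, 0.7]]),
]
centroids = train_centroids(training)
print(classify([[0.95, -0.9, 0.8], [0.05, -0.1, 0.1]], centroids))  # fist
```

The key difference from the threshold approach is that the whole multi-channel pattern, not a single intensity, selects the movement, which is what lets one residual limb drive many distinct motions.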

The company has plans to premiere a smaller, second-generation version soon, and recently licensed an implantable electrode technology from Purdue University to read electrical signals from under the skin, though Lock was tight-lipped about plans for the technology.

Coapt isn’t alone in changing the way upper-limb prostheses are controlled: Two other leading prosthetic efforts include advanced control systems. The Johns Hopkins Modular Prosthetic Limb (MPL) can also be operated with pattern recognition software, and DEKA Research’s “LUKE Arm,” named for Luke Skywalker’s prosthesis in Star Wars, has used the Coapt system and also boasts a wireless foot control system. Both the MPL and LUKE Arm were funded by DARPA. Neither is yet commercially available, though the LUKE Arm is scheduled to launch late this year.

The Hopkins’ MPL pattern recognition system was developed in-house, says Mike McLaughlin, chief engineer for research and exploratory development at the Johns Hopkins Applied Physics Laboratory, which created the MPL. “The idea is that we’re able to translate thoughts into motion.”

The LUKE Arm can be controlled in several ways, including with the Coapt system, says Tom Doyon, part of the DEKA Research team that developed the LUKE Arm. Uniquely, the LUKE Arm can also be controlled with a wireless foot control that acts as a joystick to move the arm in preprogrammed patterns.

None of the aforementioned prostheses, however, can be controlled like a natural hand. Even the best control systems execute a set of pre-programmed movements rather than offering total freedom of motion. With the Coapt system, for example, an individual can pre-program about six to eight movements—such as pointing, pinching, or making a fist—for everyday use.
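In other words, the decoder's output selects from a small fixed menu. A hypothetical movement table like the one below shows what such a pre-programmed set might look like (the movement names and descriptions here are my own illustration; real prostheses encode joint trajectories, not strings).

```python
# Hypothetical table of six pre-programmed movements (illustrative only).
PROGRAMMED_MOVEMENTS = {
    "rest":  "all joints hold position",
    "fist":  "close all fingers",
    "point": "close all fingers except index",
    "pinch": "oppose thumb and index",
    "open":  "extend all fingers",
    "key":   "press thumb against side of index",
}

def execute(movement_class):
    """Look up the decoded class; unrecognized patterns default to rest."""
    return PROGRAMMED_MOVEMENTS.get(movement_class, PROGRAMMED_MOVEMENTS["rest"])

print(execute("pinch"))   # oppose thumb and index
print(execute("wiggle"))  # unknown -> all joints hold position
```

Defaulting unrecognized patterns to a safe "rest" state reflects a common design concern in prosthetic control: a misclassified signal should never produce an unexpected movement.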

For now, the limiting factor isn’t the technology of the arm—the MPL, for example, has 26 joints and a couple hundred sensors—but the bandwidth required to decipher signals from the brain. “If you move your arm, there are probably 500 million neurons involved. Right now, the best we can do is see a few hundred of those neurons,” says McLaughlin. “We have all this stuff going on in our heads, and we have very limited capability of observing it.”

The next frontier in prosthetic control is tapping into the brain’s symphony directly, by implanting electrodes under the skin or even directly into the brain. The Hopkins MPL team, in conjunction with the University of Pittsburgh, recently tested brain electrode implants in two patients with severe spinal cord injuries. Ideally, the technology will someday be non-invasive, says McLaughlin, “but we’re still not there yet. Give us another year or so.”
