Mind-Controlled Prosthetic Hands Grasp New Feats

Users can move individual fingers simply by thinking about it

A study participant controls his prosthetic hand with a new peripheral nerve interface: he simply thinks about moving individual fingers, and the prosthetic hand makes the corresponding movement.

Luu et al.

Many exciting advances in mind-controlled artificial limbs have emerged over the past decade; now another milestone brings us a step closer to seamless integration of human and machine. In a recent study, three amputees were able to move individual fingers of their prosthetic hands simply by thinking about the movements.

The advance hinges on a new AI decoder that interprets nerve signals at the terminus of an amputated limb with unprecedented specificity. It allows users to intuitively control the wrist and individual fingers of a prosthetic hand with 97 to 98 percent accuracy. The results are described in a study published on 18 March in IEEE Transactions on Biomedical Engineering.

There are three main approaches for getting a machine to interface with a person's nervous system: via the brain, the muscles, or the peripheral nerves. Elon Musk's Neuralink brain implant is an example of a brain-interfacing system. But while brain implants perhaps offer the most comprehensive human-machine interconnection, that approach comes with the risk of damaging brain tissue. Systems that interface with the muscles at the end of an amputated limb using electromyography (EMG) offer the lowest degree of control and are the least intuitive for users.

With that in mind, Zhi Yang, a biomedical engineering professor at the University of Minnesota, and a team of colleagues from Minnesota and collaborators at the University of Texas Southwestern Medical Center have focused on developing a peripheral nerve interface. This approach involves a handful of implants within the nerves at the terminus of the amputated limb.

With this setup, users simply think of the movement they want the artificial hand to make. The user's brain sends command signals to the nerves at the connection point, where the interface detects them. An AI algorithm then decodes the signals and sends the corresponding command to the artificial limb, which executes the brain's instructions.
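To make that flow concrete, here is a minimal sketch of the sense-decode-actuate loop in Python. The channel count, window length, gesture vocabulary, and linear decoder are all illustrative assumptions, not the study's actual system.

```python
import numpy as np

N_CHANNELS = 16   # assumed number of intraneural recording channels
WINDOW_MS = 100   # assumed length of one decoding window

# Hypothetical command set; the paper reports wrist and individual
# finger control, so a small discrete gesture vocabulary is plausible.
GESTURES = ["rest", "thumb", "index", "middle", "ring", "pinky", "wrist"]

def decode_window(nerve_window: np.ndarray, weights: np.ndarray) -> str:
    """Map one window of nerve recordings to a hand command.

    nerve_window: (N_CHANNELS, samples) array of nerve activity.
    weights: (len(GESTURES), N_CHANNELS) trained linear decoder,
             a toy stand-in for the study's AI model.
    """
    features = nerve_window.mean(axis=1)   # toy feature: mean per channel
    scores = weights @ features            # one score per gesture
    return GESTURES[int(np.argmax(scores))]

def control_loop(read_nerves, actuate_hand, weights):
    """The sense-decode-actuate loop: nerves -> decoder -> prosthesis."""
    while True:
        window = read_nerves(N_CHANNELS, WINDOW_MS)  # hypothetical driver
        actuate_hand(decode_window(window, weights))
```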

In this video, a study participant uses his thoughts to move his able hand and prosthetic hand in unison.

Luu et al.

The team previously developed a sophisticated nerve interface capable of capturing high-quality data from nerve signals. In their most recent work, they developed the novel AI decoder, which analyzes the nerve signals unique to individual users.

“The combination of AI and a peripheral nerve interface offers tremendous advantages over the current approaches, such as a brain-machine interface based on cortical recordings or prosthetics based on muscle signals,” says Yang. “For example, it is the only technology today that allows amputees to control individual finger movements.”

As well, Yang notes, the new system benefits from quick processing, with a decoding rate of 6 bits per second. In contrast, other interfaces, whether brain implants or EMG-based controls, typically process less than 1 bit per second.
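A "bits per second" figure like this typically comes from an information-transfer-rate calculation that combines the number of distinguishable commands, the decoding accuracy, and the decision speed. The sketch below uses the Wolpaw ITR formula common in brain-computer-interface research; the class count and selection rate are assumptions for illustration, not numbers taken from the paper.

```python
import math

def wolpaw_itr_bits(n_classes: int, accuracy: float) -> float:
    """Wolpaw information transfer rate, in bits per selection."""
    n, p = n_classes, accuracy
    if p >= 1.0:
        return math.log2(n)
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

# Example: 7 command classes decoded at 97 percent accuracy
# (class count and selection rate are assumptions for illustration).
per_selection = wolpaw_itr_bits(7, 0.97)      # ~2.53 bits per selection
print(f"{per_selection:.2f} bits/selection")
# At roughly 2.4 selections per second, that works out to about
# 6 bits per second, matching the order of magnitude reported.
```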

Crucially, the Minnesota-Texas team’s newest advance allows users to control their artificial limbs via natural thought, whereas existing technologies require the user to execute a complex and unnatural sequence of muscle contractions in order to move the artificial limb. For example, the user may have to flex muscle A twice quickly and then muscle B once to prompt the artificial hand to close all of its fingers in a grasping motion—which creates a steep learning curve for users as they are trained on how to use the new limb.
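To see why such schemes are unintuitive, consider a toy mapping from memorized contraction sequences to hand commands. The specific sequences below are invented for illustration and are not taken from any real device.

```python
# Toy mapping from memorized contraction sequences to hand commands.
SEQUENCE_TO_COMMAND = {
    ("A", "A", "B"): "grasp",   # flex muscle A twice, then muscle B once
    ("B", "B"):      "open",
    ("A", "B", "A"): "point",
}

def decode_emg_sequence(contractions: tuple) -> str:
    """Return the command for a detected contraction sequence."""
    return SEQUENCE_TO_COMMAND.get(contractions, "no-op")

print(decode_emg_sequence(("A", "A", "B")))   # -> grasp
```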

In contrast, learning how to use this new prosthetic system is much easier. The study participants wore a virtual-reality glove on the able hand, which captured data on that hand's movements. They then imagined moving the phantom hand while simultaneously performing the same movement with the able hand.

In this way, the AI decoder “learned” the unique nerve signals and corresponding hand movements of individual users. After several hours of training, which involved doing the same hand motions repeatedly, the users were able to move the prosthetic hands by natural thought, in whatever way they desired.
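In machine-learning terms, the mirrored movements supply labeled training data: the glove provides the ground-truth gesture and the nerve interface provides the input signal. Here is a hedged sketch of that data-collection and fitting loop; the feature extraction and the logistic-regression model are stand-ins, not the study's actual decoder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def collect_sample(read_nerves, read_glove):
    """One labeled sample: nerve features plus the glove's gesture label."""
    nerve_window = read_nerves()    # hypothetical interface driver
    gesture_label = read_glove()    # gesture performed by the able hand
    return nerve_window.mean(axis=1), gesture_label

def train_decoder(read_nerves, read_glove, n_samples=5000):
    """Fit a decoder on mirrored-movement data (hours of repetition)."""
    X, y = [], []
    for _ in range(n_samples):
        features, label = collect_sample(read_nerves, read_glove)
        X.append(features)
        y.append(label)
    decoder = LogisticRegression(max_iter=1000)  # stand-in for the AI decoder
    decoder.fit(np.array(X), np.array(y))
    return decoder
```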

“It is mind reading at the [peripheral nervous system] level, where the [interface] system can interpret the user’s intent and turn that intent into action,” explains Yang. “The [user] thinks about moving his thumb; the motorized thumb moves. The user wants to make a fist; the hand makes a fist.”

He also notes that, because of the intraneural electrode placement, it’s possible to use the same electrodes to electrically stimulate the nerves to recreate a full palette of sensations such as touch, texture, vibration, and even hot or cold.
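Intraneural stimulation of this kind is usually delivered as trains of charge-balanced current pulses, whose amplitude, width, and frequency shape the evoked percept. The parameter values below are invented for illustration; real settings are device- and subject-specific.

```python
from dataclasses import dataclass

@dataclass
class StimPulseTrain:
    """Charge-balanced biphasic pulse train (values are illustrative)."""
    amplitude_ua: float     # pulse amplitude, microamps
    pulse_width_us: float   # per-phase width, microseconds
    frequency_hz: float     # pulse repetition rate
    duration_ms: float      # length of the train

# Assumption: varying frequency and amplitude is a common way to change
# the perceived quality or intensity of an evoked sensation.
steady_touch = StimPulseTrain(amplitude_ua=60, pulse_width_us=200,
                              frequency_hz=50, duration_ms=500)
vibration = StimPulseTrain(amplitude_ua=60, pulse_width_us=200,
                           frequency_hz=200, duration_ms=500)
```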

While this advance has the potential to help many amputees, the technology could be even more broadly applicable. For example, Yang and colleagues are interested in applying this work to conditions (such as epilepsy, persistent pain, heart failure, and diabetes) that could be treated, or even potentially cured, using neuromodulation.
