Brain Implants and Wearables Let Paralyzed People Move Again

A “neural bypass” routes signals around the damaged spinal cord, potentially restoring both movement and sensation


Motion Restored: Luke Tynan, who was paralyzed in 2017 by a spinal cord injury, demonstrates the wearable system that enables him to control his arm and hand. Sensors on the arm register his intentions, while electrodes stimulate the nerves and muscles to produce his desired movements.

Photo: Nathaniel Welch

In 2015, a group of neuroscientists and engineers assembled to watch a man play the video game Guitar Hero. He held the simplified guitar interface gingerly, using the fingers of his right hand to press down on the fret buttons and his left hand to hit the strum bar. What made this mundane bit of game play so extraordinary was the fact that the man had been paralyzed from the chest down for more than three years, without any use of his hands. Every time he moved his fingers to play a note, he was playing a song of restored autonomy.

His movements didn't rely on the damaged spinal cord inside his body. Instead, he used a technology that we call a neural bypass to turn his intentions into actions. First, a brain implant picked up neural signals in his motor cortex, which were then rerouted to a computer running machine-learning algorithms that deciphered those signals; finally, electrodes wrapped around his forearm conveyed the instructions to his muscles. He used, essentially, a type of artificial nervous system.

We did that research at the Battelle Memorial Institute in Columbus, Ohio. I've since moved my lab to the Institute of Bioelectronic Medicine at the Feinstein Institutes for Medical Research, in Manhasset, N.Y. Bioelectronic medicine is a relatively new field, in which we use devices to read and modulate the electrical activity within the body's nervous system, pioneering new treatments for patients. My group's particular quest is to crack the neural codes related to movement and sensation so we can develop new ways to treat the millions of people around the world who are living with paralysis—5.4 million people in the United States alone. To do this we first need to understand how electrical signals from neurons in the brain relate to actions by the body; then we need to “speak” the language correctly and modulate the appropriate neural pathways to restore movement and the sense of touch. After working on this problem for more than 20 years, I feel that we've just begun to understand some key parts of this mysterious code.

Group Effort: Study participant Luke Tynan (front) works with a team of researchers to try out the wearable neural bypass. From left: Chad Bouton, Richard Ramdeo, Santosh Chandrasekaran, and Nikunj Bhagat. Photo: Nathaniel Welch

My team, which includes electrical engineer Nikunj Bhagat, neuroscientist Santosh Chandrasekaran, and clinical manager Richard Ramdeo, is using that information to build two different kinds of synthetic nervous systems. One approach uses brain implants for high-fidelity control of paralyzed limbs. The other employs noninvasive wearable technology that provides less precise control, but has the benefit of not requiring brain surgery. That wearable technology could also be rolled out to patients relatively soon.

Ian Burkhart, the participant in the Guitar Hero experiment, was paralyzed in 2010 when he dove into an ocean wave and was pushed headfirst into a sandbar. The impact fractured several vertebrae in his neck and damaged his spinal cord, leaving him paralyzed from the middle of his chest down. His injury blocks electrical signals generated by his brain from traveling down his nerves to trigger actions by his muscles. During his participation in our study, technology replaced that lost function. His triumphs—which also included swiping a credit card and pouring water from a bottle into a glass—were among the first times a paralyzed person had successfully controlled his own muscles using a brain implant. And they pointed to two ways forward for our research.

Rocking Out: In 2015, study participant Ian Burkhart used the first version of the implant-based neural bypass to play the game Guitar Hero. Photo: Ohio State University Wexner Medical Center/Battelle

The system Burkhart used was experimental, and when the study ended, so did his new autonomy. We set out to change that. In one thrust of our research, we're developing noninvasive wearable technology, which doesn't require a brain implant and could therefore be adopted by the paralyzed community fairly quickly. As I'll describe later in this article, tetraplegic people are already using this system to reach out and grasp a variety of objects. We are working to commercialize this noninvasive technology and hope to gain clearance from the U.S. Food and Drug Administration within the next year. That's our short-term goal.

We're also working toward a long-term vision of a bidirectional neural bypass, which will use brain implants to pick up the signals close to the source and to return feedback from sensors we'll place on the limb. We hope this two-way system will restore both motion and sensation, and we've embarked on a clinical trial to test this approach. We want people like Burkhart to feel the guitar as they make music with their paralyzed hands.

Paralysis used to be considered a permanent condition. But in the past two decades, there's been remarkable progress in reading neural signals from the brain and using electrical stimulation to power paralyzed muscles.

In the early 2000s, the BrainGate consortium began groundbreaking work with brain implants that picked up signals from the motor region of the brain, using those signals to control various machines. I had the privilege of working with the consortium during the early years and developed machine-learning algorithms to decipher the neural code. In 2007, those algorithms helped a woman who was paralyzed due to a stroke drive a wheelchair with her thoughts. By 2012, the team had enabled a paralyzed woman to use a robotic arm to pick up a bottle. Meanwhile, other researchers were using implanted electrodes to stimulate the spinal cord, enabling people with paralyzed legs to stand up and even walk.

My research group has continued to tackle both sides of the problem: reading the signals from the brain as well as stimulating the muscles, with a focus on the hands. Around the time that I was working with the BrainGate team, I remember seeing a survey that asked people with spinal cord injuries about their top priorities. Tetraplegics—that is, people with paralysis of all four limbs—responded that their highest priority was regaining function in their arms and hands.

Robotics has partially filled that need. One commercially available robotic arm can be operated with wheelchair controls, and studies have explored controlling robotic arms through brain implants or scalp electrodes. But some people still long to use their own arms. When Burkhart spoke to the press in 2016, he said that he'd rather not have a robotic arm mounted on his wheelchair, because he felt it would draw too much attention. Unobtrusive technology to control his own arm would allow him “to function almost as a normal member of society,” he said, “and not be treated as a cyborg.”

Restoring movement in the hands is a daunting challenge. The human hand has more than 20 degrees of freedom, or ways in which it can move and rotate—that's many more than the leg. That means there are many more muscles to stimulate, which creates a highly complex control-systems problem. And we don't yet completely understand how all of the hand's intricate movements are encoded in the brain. Despite these challenges, my group set out to give tetraplegics back their hands.

Burkhart's implant was in his brain's motor cortex, in a region that controls hand movements. Researchers have extensively mapped the motor cortex, so there's plenty of information about how general neuronal activity there correlates with movements of the hand as a whole, and each finger individually. But the amount of data coming off the implant's 96 electrodes was formidable: Each one measured activity 30,000 times per second. In this torrent of data, we had to find the discrete signals that meant “flex the thumb” or “extend the index finger.”
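A back-of-the-envelope calculation gives a feel for the scale of that torrent. The article states 96 electrodes sampling 30,000 times per second; the 16-bit sample width below is our assumption, a typical resolution for neural recording hardware.

```python
# Rough data-rate estimate for the 96-electrode implant.
# The 16-bit sample width is an assumption; the article doesn't specify it.
electrodes = 96
sample_rate_hz = 30_000          # samples per second per electrode
bytes_per_sample = 2             # assumed 16-bit ADC resolution

samples_per_second = electrodes * sample_rate_hz
bytes_per_second = samples_per_second * bytes_per_sample

print(f"{samples_per_second:,} samples/s")       # 2,880,000 samples/s
print(f"{bytes_per_second / 1e6:.2f} MB/s raw")  # 5.76 MB/s raw
```

At nearly six megabytes per second of raw data, finding the few bits that mean “flex the thumb” is very much a needle-in-a-haystack problem.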

To decode the signals, we used a combination of artificial intelligence and human perseverance. Our stalwart volunteer attended up to three sessions weekly for 15 weeks to train the system. In each session, Burkhart would watch an animated hand on a computer screen move and flex its fingers, and he'd imagine making the same movements while the implant recorded his neurons' activity. Over time, a machine-learning algorithm figured out which pattern of activity corresponded to the flexing of a thumb, the extension of an index finger, and so on.

Once our neural-bypass system understood the signals, it could generate a pattern of electrical pulses for the muscles of Burkhart's forearm, in theory mimicking the pulses that the brain would send down an undamaged spinal cord and through the nerves. But in reality, translating Burkhart's intentions to muscle movements required another intense round of training and calibration. We spent countless hours stimulating different sets of the 130 electrodes wrapped around his forearm to determine how to control the muscles of his wrist, hand, and each finger. But we couldn't duplicate all of the movements the hand can make, and we never quite got control of the pinkie! We knew we had to develop something better.
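Conceptually, the result of that calibration is a lookup from decoded intention to a stimulation pattern across the forearm electrodes. The sketch below is purely illustrative: the electrode indices and pulse parameters are invented, and the real maps were found empirically through those countless hours of trial and error.

```python
# Hypothetical lookup from decoded intention to a stimulation pattern.
# Electrode indices and pulse parameters are invented for illustration;
# the real calibration maps were determined experimentally.
stim_patterns = {
    "flex_thumb":   {"electrodes": [12, 13, 27],   "pulse_hz": 50, "amplitude_ma": 8},
    "extend_index": {"electrodes": [41, 42, 58],   "pulse_hz": 50, "amplitude_ma": 6},
    "open_hand":    {"electrodes": [3, 9, 77, 90], "pulse_hz": 40, "amplitude_ma": 10},
}

def stimulation_for(intention):
    """Return the calibrated stimulation command, or None if no
    pattern was ever found (as with the elusive pinkie)."""
    return stim_patterns.get(intention)

print(stimulation_for("flex_thumb")["electrodes"])  # [12, 13, 27]
print(stimulation_for("flex_pinkie"))               # None
```

The gaps in such a table, movements for which no stimulation pattern could be found, are exactly why we knew the 130-electrode sleeve approach needed to improve.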

Grab a Snack: Casey Ellin, who was partially paralyzed by a spinal cord injury, tests out an earlier prototype version of the wearable neural bypass system. Video: The Feinstein Institutes for Medical Research

To make a more practical and convenient system, we decided to develop a version that is completely noninvasive, which we call GlidePath. We recruited volunteers who have spinal cord injuries but still have some mobility in their shoulders. We placed a proprietary mix of inertial and biometric sensors on the volunteers' arms, and asked them to imagine reaching for different objects. The data from the sensors fed into a machine-learning algorithm, enabling us to infer the volunteers' grasping intentions. Flexible electrodes on their forearms then stimulated their muscles in a particular sequence. In one session, volunteer Casey Ellin used this wearable bypass to pick up a granola bar from a table and bring it to his mouth to take a bite. We published these results in 2020 in the journal Bioelectronic Medicine.

My team is working to integrate the sensors and the stimulators into lightweight and inconspicuous wearables; we're also developing an app that will be paired with the wearable, so that clinicians can check and adjust the stimulation settings. This setup will allow for remote rehabilitation sessions, because the data from the app will be uploaded to the cloud.

To speed up the process of calibrating the stimulation patterns, we're building a database of how the patterns map to hand movements, with the help of both able-bodied and paralyzed volunteers. While each person responds differently to stimulation, there are enough similarities to train our system. It's analogous to Amazon's Alexa voice assistant, which is trained on thousands of voices and comes out of the box ready to go—but which over time further refines its understanding of its specific users' speech patterns. Our wearables will likewise be ready to go immediately, offering basic functions like opening and closing the hand. But they'll continue to learn about their users' intentions over time, helping with the movements that are most important to each user.
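The pretrain-then-personalize idea can be sketched as a decoder that ships with population-average parameters and then nudges them toward each user's own data. The running-average update below is our illustration of the concept, not the team's published method, and the template values are invented.

```python
# Toy illustration of pretrain-then-personalize decoding.
# The model (nearest-template matching) and update rule (running
# average) are our assumptions; the real system is not described
# at this level of detail in the article.
import numpy as np

class PersonalizedDecoder:
    """Starts from population-average templates, adapts to one user."""

    def __init__(self, population_templates, learning_rate=0.1):
        self.templates = {k: np.array(v, dtype=float)
                          for k, v in population_templates.items()}
        self.lr = learning_rate

    def decode(self, signal):
        signal = np.asarray(signal, dtype=float)
        return min(self.templates,
                   key=lambda k: np.linalg.norm(self.templates[k] - signal))

    def adapt(self, signal, true_label):
        """Nudge the matching template toward this user's signal."""
        t = self.templates[true_label]
        self.templates[true_label] = t + self.lr * (np.asarray(signal) - t)

# Population templates (invented numbers, two sensor channels).
dec = PersonalizedDecoder({"open_hand": [1.0, 0.0], "close_hand": [0.0, 1.0]})
print(dec.decode([0.9, 0.1]))      # open_hand -- works out of the box
dec.adapt([0.7, 0.3], "open_hand") # then refines toward this user
```

Like the voice assistant in the analogy, such a decoder offers basic functions immediately and gradually shifts its internal templates toward the signals of the person actually wearing it.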

Patching Through: Chad Bouton (left) holds the latest version of the wearable patch that stimulates nerves and muscles when placed on the user's forearm (right). Photos: Nathaniel Welch

We think this technology can help people with spinal cord injuries as well as people recovering from strokes, and we're collaborating with Good Shepherd Rehabilitation Hospital and the Barrow Neurological Institute to test our technology. Stroke patients commonly receive neuromuscular electrical stimulation, to assist with voluntary movements and help recover motor function. There's considerable evidence that such rehab works better when a patient actively tries to make a movement while electrodes stimulate the proper muscles; that connected effort by brain and muscles has been shown to increase “plasticity,” or the ability of the nervous system to adapt to damage. Our system will ensure that the patient is fully engaged, as the stimulation will be triggered by the patient's intention. We plan to collect data over time, and we hope to see patients eventually regain some function even when the technology is turned off.

As exciting as the wearable applications are, today's noninvasive technology can't readily control complex finger movements. We don't expect the GlidePath technology to immediately enable people to play Guitar Hero, much less a real guitar. So we've continued to work on a neural bypass that involves brain implants.

When Burkhart used the earlier version of the neural bypass, he told us that it offered a huge step toward independence. But there were a lot of practical things we hadn't considered. He told us, “It is strange to not feel the object I'm holding." Daily tasks like buttoning a shirt require such sensory feedback. We decided then to work on a two-way neural bypass, which conveyed movement commands from the brain to the hand and sent sensory feedback from the hand to the brain, skipping over the damaged spinal cord in both directions.

The Two-Way Bypass

To enable a paralyzed person to pick up an object, implanted electrode arrays in the motor cortex (1) pick up the neural signals generated as the person imagines moving his arm and hand. Those noisy brain signals are then decoded by an AI-powered processor (2), which sends nerve-stimulation instructions to an electrode patch (3) on the person's forearm. As the person grabs the object, thin-film sensors on the hand (4) register the sensory information. That data passes back through the processor, and stimulation instructions are sent to implanted electrode arrays in the sensory cortex (5)—allowing the person to “feel” the object and adjust his grip if necessary. Another electrode array on the spinal cord (6) stimulates the spinal nerves during this process, in hopes of encouraging regrowth and repair.
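Laid out as a control loop, the six numbered stages form a repeating cycle. The sketch below is purely schematic: every name is a placeholder standing in for hardware or algorithms described in the article, not a real API.

```python
# Schematic control loop for the two-way neural bypass.
# Every object and method here is a placeholder; none of these
# names correspond to real devices or software interfaces.

def run_bypass_cycle(motor_implant, decoder, arm_patch,
                     hand_sensors, encoder, sensory_implant, spinal_array):
    # (1) Record neural activity as the person imagines the movement.
    neural_signals = motor_implant.read()

    # (2) Decode the intention from the noisy brain signals.
    intention = decoder.decode(neural_signals)

    # (3) Stimulate the forearm muscles to produce the movement.
    arm_patch.stimulate(decoder.pattern_for(intention))

    # (4) Read touch data from the thin-film sensors on the hand.
    touch = hand_sensors.read()

    # (5) Stimulate the sensory cortex so the person "feels" the object.
    sensory_implant.stimulate(encoder.encode(touch))

    # (6) Stimulate the spinal cord to encourage plasticity and repair.
    spinal_array.stimulate()
```

In a working system this cycle would repeat many times per second, so that sensation arrives quickly enough for the user to adjust their grip mid-movement.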

The Wearable Bypass

Paralyzed people with some motor function remaining in their arms can make use of a less invasive, though less precise, approach. A patch on the forearm (3) registers biometric signals as the person attempts to use his hand. Those noisy biometric signals are decoded by an AI-powered processor (2), which sends nerve-stimulation instructions to electrodes on that same forearm patch.

To give people sensation from their paralyzed hands, we knew that we'd need both finely tuned sensors on the hand and an implant in the sensory cortex region of the brain. For the sensors, we started by thinking about how human skin sends feedback to the brain. When you pick something up—say, a disposable cup filled with coffee—the pressure compresses the underlying layers of skin. Your skin moves, stretches, and deforms as you lift the cup. The thin-film sensors we developed can detect the pressure of the cup against the skin, as well as the shear (transverse) force exerted on the skin as you lift the cup and gravity pulls it down. This delicate feedback is crucial, because there's a very narrow range of appropriate movement in that circumstance; if you squeeze the cup too tightly, you'll end up with hot coffee all over you.

Each of our sensors has different zones that detect the slightest pressure or shear force. By aggregating the measurements, our system determines exactly how the skin is bending or stretching. The processor will send that information to the implants in the sensory cortex, enabling a user to feel the cup in their hand and adjust their grip as needed.
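A classic way to use pressure and shear together is slip detection: when the total shear force approaches the grip pressure times the friction coefficient, the object is about to slide, and grip force must increase. The minimal sketch below illustrates that physics; the friction coefficient, safety margin, and the rule itself are our invented illustration, not the system's actual control law.

```python
# Minimal slip-detection sketch using aggregated pressure and shear.
# The friction coefficient, safety margin, and adjustment rule are
# invented for illustration; the article describes the sensors but
# not this specific control logic.

def grip_adjustment(zone_pressures, zone_shears,
                    friction_coeff=0.4, safety_margin=0.8):
    """Return extra grip force (newtons) needed to prevent slipping.

    zone_pressures, zone_shears: per-zone sensor readings in newtons.
    Slip is imminent when total shear nears friction * total pressure.
    """
    pressure = sum(zone_pressures)
    shear = sum(zone_shears)
    max_safe_shear = friction_coeff * pressure * safety_margin
    if shear <= max_safe_shear:
        return 0.0  # grip is adequate
    # Extra normal force needed so shear falls back inside the margin.
    return shear / (friction_coeff * safety_margin) - pressure

# Holding a coffee cup: 5 N total grip, 1 N shear -> no change needed.
print(grip_adjustment([2.0, 3.0], [0.5, 0.5]))            # 0.0
# The cup starts to slip: shear rises to 2 N -> squeeze 1.25 N harder.
print(round(grip_adjustment([2.0, 3.0], [1.0, 1.0]), 2))  # 1.25
```

This also shows why the coffee-cup example is so demanding: the window between “slipping” and “crushing” is narrow, so the per-zone measurements feeding that calculation have to be both fast and precise.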

Touch and Feel: An fMRI image (top) shows brain activity associated with hand movements. The two-way bypass records from the motor cortex and stimulates the sensory cortex. Thin-film sensors (bottom) measure pressure and force; that data goes to stimulating electrodes in the sensory cortex. Photos, top: The Feinstein Institutes for Medical Research; bottom: Abigail Bouton

Figuring out exactly where to stimulate the sensory cortex was another challenge. The part of the sensory cortex that receives input from the hand hasn't been mapped exhaustively via electrodes, in part because the regions dealing with the fingertips are tucked into a groove in the brain called the central sulcus. To fill in this blank spot on the map, we worked with our neurosurgeon colleagues Ashesh Mehta and Stephan Bickel, along with hospitalized epilepsy patients who underwent procedures to map their seizure activity. Depth electrodes were used to stimulate areas within that groove, and patients were asked where they felt sensation. We were able to elicit sensation in very specific parts of the hand, including the crucial fingertips.

That knowledge prepared us for the clinical trial that marks the next step in our research. We're currently enrolling volunteers with tetraplegia for the study, in which the neurosurgeons on our team will implant three arrays of electrodes in the sensory cortex and two in the motor cortex. Stimulating the sensory cortex will likely bring new challenges for the decoding algorithms that interpret the neural signals in the motor cortex, which is right next door to the sensory cortex—there will certainly be some changes to the electrical signals we pick up, and we'll have to learn to compensate for them.

In the study, we've added one other twist. In addition to stimulating the forearm muscles and the sensory cortex, we're also going to stimulate the spinal cord. Our reasoning is as follows: In the spinal cord, there are 10 million neurons in complex networks. Earlier research has shown that these neurons have some ability to temporarily direct the body's movements even in the absence of commands from the brain. We'll have our volunteers concentrate on an intended movement, to physically make the motion with the help of electrodes on the forearm, and receive feedback from the sensors on the hand. If we stimulate the spinal cord while this process is going on, we believe we can promote plasticity within its networks, strengthening connections between neurons within the spinal cord that are involved in the hand's movements. It's possible that we'll achieve a restorative effect that lasts beyond the duration of the study: Our dream is to give people with damaged spinal cords their hands back.

One day, we hope that brain implants for people with paralysis will be clinically proven and approved for use, enabling them to go well beyond playing Guitar Hero. We'd like to see them making complex movements with their hands, such as tying their shoes, typing on a keyboard, and playing scales on a piano. We aim to let these people reach out to clasp hands with their loved ones and feel their touch in return. We want to restore movement, sensation, and ultimately their independence.

This article appears in the February 2021 print issue as “Bypassing Paralysis."
