
A Prosthetic That Feels Pain

Electronic receptors mimic the ability of human skin to sense pain and pressure

This graphic shows the process by which signals move from the e-dermis to the nervous system of the wearer.
Courtesy of Osborn et al

By mimicking the natural abilities of our skin, a team of researchers at Johns Hopkins University has enabled a prosthesis to perceive and transmit the feeling of pain.

But why would anyone want to feel pain? Study author Nitish Thakor, a professor of biomedical engineering at Hopkins and IEEE Fellow, has been getting that question a lot.

In the most practical sense, pain sensors in the skin help protect our bodies from damaging objects, such as a hot stove or sharp knife. By the same token, an amputee could rely on the perception of pain to protect his or her prosthesis from damage, says Thakor.

But he also gives a more holistic, almost poetic answer: “We can now span a very human-like sense of perception, from light touch to pressure to pain, and I think that makes prosthetics more human.”

In a study published today in Science Robotics, Thakor, along with graduate student Luke Osborn and their colleagues, describes the design and initial test of their “e-dermis” system. It’s the latest in an ongoing effort to add a sense of touch to prosthetics, à la Luke Skywalker feeling a needle prick the fingers and palm of his bionic hand.

The Hopkins team was inspired by the way biological touch receptors work in human skin, says Thakor. Real skin consists of layers of receptors. Similarly, the e-dermis has numerous layers—made of piezoresistive and conductive fabrics, rather than different types of cells—that sense and measure pressure. Also like real skin, those sensing layers react in different ways to pressure: some react quickly to stimuli, while others respond more slowly.
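The paper does not spell out the sensing model in the passage above, but the fast- versus slow-responding behavior can be illustrated with a minimal sketch. Everything here is a placeholder: the function name, time constant, and first-order dynamics are illustrative assumptions, not the Hopkins design.

```python
import numpy as np

def receptor_responses(pressure, dt=0.001, tau_fast=0.05):
    """Toy model of fast- and slow-responding sensing layers.

    pressure : 1-D array of pressure samples over time (arbitrary units).
    The slow layer simply tracks sustained pressure; the fast layer
    responds mainly to changes, then decays with time constant tau_fast.
    """
    slow = pressure.copy()              # sustained response to held pressure
    fast = np.zeros_like(pressure)      # transient response to changes
    for i in range(1, len(pressure)):
        d_p = pressure[i] - pressure[i - 1]
        # rise with changes in pressure, decay back toward zero otherwise
        fast[i] = fast[i - 1] * (1 - dt / tau_fast) + abs(d_p)
    return fast, slow

# Example: a press-and-hold produces a brief burst in the fast layer
# and a plateau in the slow layer, loosely analogous to skin receptors.
t = np.arange(0, 1, 0.001)
p = np.where((t > 0.2) & (t < 0.8), 1.0, 0.0)
fast, slow = receptor_responses(p)
```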

The pressure information from the e-dermis is converted into neuron-like pulses that are similar to the spikes of electricity, or action potentials, that living neurons use to communicate. That neuron-like, or neuromorphic, signal is then delivered via small electrical stimulations to the peripheral nerves in the skin of an amputee to elicit feelings of pressure and, yes, pain.
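The study’s actual neuromorphic model isn’t reproduced here, but the general idea of turning a pressure signal into neuron-like spikes can be sketched with a simple leaky integrate-and-fire loop. All names and parameter values below are hypothetical choices for illustration only.

```python
def pressure_to_spikes(pressure, dt=0.001, tau=0.05, threshold=0.5, gain=100.0):
    """Leaky integrate-and-fire sketch: map a pressure trace to spike times.

    Each sample drives a membrane-like variable v; when v crosses the
    threshold it resets and a 'spike' is recorded. Stronger pressure
    produces more frequent spikes, which could then set the timing of
    stimulation pulses delivered to the peripheral nerves.
    """
    v = 0.0
    spike_times = []
    for i, p in enumerate(pressure):
        v += dt * (-v / tau + gain * p)   # leaky integration of the input drive
        if v >= threshold:
            spike_times.append(i * dt)    # record a spike time in seconds
            v = 0.0                       # reset after firing
    return spike_times

# One second of light versus firm pressure: the firm trace yields
# a noticeably higher spike rate.
light = pressure_to_spikes([0.2] * 1000)
firm = pressure_to_spikes([1.0] * 1000)
```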

Thanks to a dedicated volunteer who was not named in the study, the team was able to implement and test their system. Osborn spent two months mapping the peripheral nerves in the amputated left arm of a 29-year-old man who had an above-the-elbow amputation following an illness. Using small electrical stimulations, the graduate student mapped out how different peripheral nerves in the volunteer’s residual limb related to his feeling of a phantom limb.


During this process, Osborn discovered that the right amount of current delivered at a specific frequency elicited not only a sense of touch, but a sense of pain. (Not too much pain though—they stimulated the nerves until the volunteer felt a 3 out of 10 on a pain scale, Thakor carefully notes.)

The team then put the whole system in place—e-dermis on the fingers of the prosthetic, neuron-like signaling model in the prosthesis controller, and electrical stimulator on the residual limb. With the system, the volunteer could clearly distinguish between rounded and sharp objects and felt the sensation coming directly from his phantom limb. In an additional experiment, the prosthesis was programmed with a pain reflex so that it automatically released a sharp object when pain was detected.
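The reflex described above amounts to a threshold policy on the sensed signal. As a rough, hedged sketch (the function, threshold value, and grip interface are all assumptions, not the team’s controller):

```python
def reflex_step(stimulus_level, pain_threshold=0.8, grip_force=1.0):
    """Toy pain-reflex policy for a prosthetic grasp.

    stimulus_level : normalized output of the sensing/spiking pipeline (0-1).
    Returns the commanded grip force; 0.0 means 'let go'.
    """
    if stimulus_level > pain_threshold:
        return 0.0           # reflexively release the sharp object
    return grip_force        # maintain the grasp for innocuous contact
```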

In this single case study, the touch information was delivered to the nervous system by stimulating the skin of the amputee, but it could also be delivered via other technologies, such as implanted electrodes, targeted muscle reinnervation, and maybe, someday, brain-machine interfaces.

“Someday all this could be implanted to directly go to nerve rather than via skin, but this approach is available here and now,” says Thakor, who is also co-founder of a prosthetics company, Infinite Biomedical Technologies. Moving forward, his lab plans to investigate other materials for the e-dermis and explore how to deliver a wider range of sensations.

The technology also has possible applications in robotics and augmented reality, though Thakor declined to disclose any current ideas or projects in the works. Better tactile capabilities could clearly help robots grasp objects more reliably and perform a wider range of tasks. And if the robotics industry adopted the technology, mass manufacturing could drive costs down dramatically and lead to widespread adoption.

Illustration showing an astronaut performing mechanical repairs to a satellite using two extra mechanical arms that project from a backpack.

Extra limbs, controlled by wearable electrode patches that read and interpret neural signals from the user, could have innumerable uses, such as assisting on spacewalk missions to repair satellites.

Chris Philpot

What could you do with an extra limb? Consider a surgeon performing a delicate operation, one that needs her expertise and steady hands—all three of them. As her two biological hands manipulate surgical instruments, a third robotic limb that’s attached to her torso plays a supporting role. Or picture a construction worker who is thankful for his extra robotic hand as it braces the heavy beam he’s fastening into place with his other two hands. Imagine wearing an exoskeleton that would let you handle multiple objects simultaneously, like Spider-Man’s Doctor Octopus. Or contemplate the out-there music a composer could write for a pianist who has 12 fingers to spread across the keyboard.

Such scenarios may seem like science fiction, but recent progress in robotics and neuroscience makes extra robotic limbs conceivable with today’s technology. Our research groups at Imperial College London and the University of Freiburg, in Germany, together with partners in the European project NIMA, are now working to figure out whether such augmentation can be realized in practice to extend human abilities. The main questions we’re tackling involve both neuroscience and neurotechnology: Is the human brain capable of controlling additional body parts as effectively as it controls biological parts? And if so, what neural signals can be used for this control?
