A Prosthetic That Feels Pain

Electronic receptors mimic the ability of human skin to sense pain and pressure

This graphic shows how signals travel from the e-dermis to the wearer's nervous system.
Courtesy of Osborn et al

By mimicking the natural abilities of our skin, a team of researchers at Johns Hopkins University has enabled a prosthesis to perceive and transmit the feeling of pain.

But why would anyone want to feel pain? Study author Nitish Thakor, a professor of biomedical engineering at Hopkins and IEEE Fellow, has been getting that question a lot.

In the most practical sense, pain sensors in the skin help protect our bodies from damaging objects, such as a hot stove or sharp knife. By the same token, an amputee could rely on the perception of pain to protect his or her prosthesis from damage, says Thakor.

But he also gives a more holistic, almost poetic answer: “We can now span a very human-like sense of perception, from light touch to pressure to pain, and I think that makes prosthetics more human.”

In a study published today in Science Robotics, Thakor, graduate student Luke Osborn, and their colleagues describe the design and initial test of their “e-dermis” system. It’s the latest in an ongoing effort to add a sense of touch to prosthetics, à la Luke Skywalker feeling a needle prick the fingers and palm of his bionic hand.

The Hopkins team was inspired by the way biological touch receptors work in human skin, says Thakor. Real skin consists of layers of receptors. Similarly, the e-dermis has numerous layers—made of piezoresistive and conductive fabrics, rather than different types of cells—that sense and measure pressure. Also like real skin, those sensing layers react in different ways to pressure: some react quickly to stimuli, while others respond more slowly.

The pressure information from the e-dermis is converted into neuron-like pulses that are similar to the spikes of electricity, or action potentials, that living neurons use to communicate. That neuron-like, or neuromorphic, signal is then delivered via small electrical stimulations to the peripheral nerves in the skin of an amputee to elicit feelings of pressure and, yes, pain.
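The paper's actual neuromorphic model is more elaborate, but the core idea of turning a continuous pressure reading into neuron-like spikes can be sketched with simple rate coding, where stronger pressure raises the probability of firing in each time step. Everything here (function name, the 200 Hz ceiling, the Poisson-style firing rule) is an illustrative assumption, not the authors' implementation:

```python
import random

def pressure_to_spikes(pressure_samples, dt=1e-3, max_rate=200.0):
    """Rate-code normalized pressure samples (0..1) into a spike train.

    Hypothetical simplification of a neuromorphic encoder: higher
    pressure -> higher firing probability in each time step of dt
    seconds, up to max_rate spikes per second.
    """
    spikes = []
    for p in pressure_samples:
        p = min(max(p, 0.0), 1.0)          # clamp to the valid range
        rate = p * max_rate                # instantaneous firing rate, Hz
        spikes.append(random.random() < rate * dt)  # Poisson-like firing
    return spikes
```

A spike train like this could then drive the timing of the small electrical stimulations delivered to the residual limb.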

Thanks to a dedicated volunteer who was not named in the study, the team was able to implement and test their system. Osborn spent two months mapping the peripheral nerves in the amputated left arm of a 29-year-old man who had an above-the-elbow amputation following an illness. Using small electrical stimulations, the graduate student mapped out how different peripheral nerves in the volunteer’s residual limb related to his feeling of a phantom limb.


During this process, Osborn discovered that the right amount of current delivered at a specific frequency elicited not only a sense of touch, but a sense of pain. (Not too much pain though—they stimulated the nerves until the volunteer felt a 3 out of 10 on a pain scale, Thakor carefully notes.)

The team then put the whole system in place—e-dermis on the fingers of the prosthetic, neuron-like signaling model in the prosthesis controller, and electrical stimulator on the residual limb. With the system, the volunteer could clearly distinguish between rounded and sharp objects and felt the sensation coming directly from his phantom limb. In an additional experiment, the prosthesis was programmed with a pain reflex so that it automatically released a sharp object when pain was detected.
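The pain-reflex experiment amounts to a simple threshold controller: when the estimated pain signal crosses a limit, the hand opens. The following sketch is a hedged illustration of that idea only; the pain score, the threshold value, and the function names are assumptions, not the prosthesis controller described in the study:

```python
def reflex_action(peak_pressure, sharpness, pain_threshold=0.8):
    """Hypothetical pain reflex for a prosthetic hand.

    Combines a normalized peak-pressure reading (0..1) with a
    sharpness estimate (0..1) into a crude pain score, and commands
    the hand to release the object when that score crosses the
    threshold; otherwise it keeps holding.
    """
    pain = peak_pressure * sharpness       # illustrative pain estimate
    return "release" if pain >= pain_threshold else "hold"
```

With a controller like this, a sharp object gripped firmly triggers a release, while a rounded object at the same pressure does not.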

In this single case study, the touch information was delivered to the nervous system by stimulating the skin of the amputee, but it could also be delivered via other technologies, such as implanted electrodes, targeted muscle reinnervation, and maybe, someday, brain-machine interfaces.

“Someday all this could be implanted to directly go to nerve rather than via skin, but this approach is available here and now,” says Thakor, who is also co-founder of a prosthetics company, Infinite Biomedical Technologies. Moving forward, his lab plans to investigate other materials for the e-dermis and explore how to deliver a wider range of sensations.

The technology also has possible applications in robotics and augmented reality, though Thakor declined to disclose any current ideas or projects in the works. But it’s clear that better tactile capabilities could help robots grasp objects more reliably and perform a wider range of functions. And if the robotics industry adopted such a technology, mass manufacturing could drive costs down dramatically and speed widespread adoption.


This CAD Program Can Design New Organisms

Genetic engineers have a powerful new tool to write and edit DNA code


Foundries such as the Edinburgh Genome Foundry assemble fragments of synthetic DNA and send them to labs for testing in cells.

Edinburgh Genome Foundry, University of Edinburgh

In the next decade, medical science may finally advance cures for some of the most complex diseases that plague humanity. Many diseases are caused by mutations in the human genome, which can either be inherited from our parents (such as in cystic fibrosis), or acquired during life, such as most types of cancer. For some of these conditions, medical researchers have identified the exact mutations that lead to disease; but in many more, they're still seeking answers. And without understanding the cause of a problem, it's pretty tough to find a cure.

We believe that a key enabling technology in this quest is a computer-aided design (CAD) program for genome editing, which our organization is launching this week at the Genome Project-write (GP-write) conference.

With this CAD program, medical researchers will be able to quickly design hundreds of different genomes with any combination of mutations and send the genetic code to a company that manufactures strings of DNA. Those fragments of synthesized DNA can then be sent to a foundry for assembly, and finally to a lab where the designed genomes can be tested in cells. Based on how the cells grow, researchers can use the CAD program to iterate with a new batch of redesigned genomes, sharing data for collaborative efforts. Enabling fast redesign of thousands of variants can only be achieved through automation; at that scale, researchers just might identify the combinations of mutations that are causing genetic diseases. This is the first critical R&D step toward finding cures.
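The design-and-test loop described above starts by enumerating genome variants that carry different combinations of candidate mutations. As a rough sketch of that first step only (the mutation labels and the two-mutation cap are illustrative assumptions, not features of the GP-write tool):

```python
from itertools import combinations

# Illustrative candidate mutations a researcher might want to test
candidate_mutations = ["mutA", "mutB", "mutC"]

def design_variants(mutations, max_combined=2):
    """Yield every genome design carrying 1..max_combined mutations.

    Hypothetical stand-in for the batch of designs a genome CAD tool
    would hand off to a DNA synthesis company.
    """
    for k in range(1, max_combined + 1):
        for combo in combinations(mutations, k):
            yield combo

designs = list(design_variants(candidate_mutations))
# three single-mutation designs plus three double-mutation designs
```

Each design would then be synthesized, assembled at a foundry, and tested in cells, with the growth results feeding the next round of designs.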
