
Engineers Work on Laser-Based Brain-Machine Interface for Prosthetic Arm

Laser stimulation of nerves may light the way to better nervous-system feedback for prosthetics

18 February 2008—Biomedical engineers are working to develop reliable brain-machine interfaces that will someday let amputees manipulate prosthetic limbs as naturally as they do their native ones. But hacking the nervous system is easier said than done. Today’s state-of-the-art method for connecting to the human nervous system is to deliver electrical pulses near a particular nerve cell to elicit a response, such as a muscle twitch or a sensation. The trouble is that the electrode delivering the pulse creates a halo of charge that triggers nearby nerve fibers, an effect similar to crosstalk on telecommunications lines. As a result, the brain might misinterpret a jolt from a prosthetic arm, one meant to signal only that the index finger is pressing against an object, as confirmation that the entire artificial hand has grasped it.

But researchers at Vanderbilt University, in Nashville, think they’ve found a better way. Late last year, they began clinical tests using a portable solid-state laser that can stimulate nerves more effectively and more precisely than electricity. Using a similar laser aimed at the sciatic nerve of laboratory rats, they made part of the animals’ legs twitch involuntarily with each laser pulse. A slight movement of the beam across the nerve bundle, which shifts the narrow beam’s focus from one fiber within the nerve to another, can cause a rat to switch from, say, curling its toes to flexing its foot.

Are You Ready for Workplace Brain Scanning?

Extracting and using brain data will make workers happier and more productive, backers say

[Photo collage: a man wearing an EEG headset while looking at a computer screen. Illustration: Nadia Radic]

Get ready: Neurotechnology is coming to the workplace. Neural sensors are now reliable and affordable enough to support commercial pilot projects that extract productivity-enhancing data from workers’ brains. These projects aren’t confined to specialized workplaces; they’re also happening in offices, factories, farms, and airports. The companies and people behind these neurotech devices are certain that they will improve our lives. But there are serious questions about whether work should be organized around certain functions of the brain, rather than the person as a whole.

To be clear, the kind of neurotech that’s currently available is nowhere close to reading minds. Sensors detect electrical activity across different areas of the brain, and the patterns in that activity can be broadly correlated with different feelings or physiological responses, such as stress, focus, or a reaction to external stimuli. These data can be exploited to make workers more efficient—and, proponents of the technology say, to make them happier. Two of the most interesting innovators in this field are the Israel-based startup InnerEye, which aims to give workers superhuman abilities, and Emotiv, a Silicon Valley neurotech company that’s bringing a brain-tracking wearable to office workers, including those working remotely.
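The kind of correlation described above is typically computed from the relative power of standard EEG frequency bands. As a rough illustration only, here is a minimal sketch of that idea in Python with NumPy; the `band_power` helper and the beta/alpha "engagement" ratio are common textbook constructs, not the actual pipeline used by InnerEye or Emotiv.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Estimate the power of `signal` in the band [f_lo, f_hi) Hz
    using a simple periodogram (squared FFT magnitudes)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

# Synthetic one-second "EEG" trace: a strong 10 Hz (alpha) component
# plus a weaker 20 Hz (beta) component, sampled at 256 Hz.
fs = 256
t = np.arange(fs) / fs
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

alpha = band_power(eeg, fs, 8, 13)   # often associated with relaxed wakefulness
beta = band_power(eeg, fs, 13, 30)   # often associated with active concentration
ratio = beta / alpha                 # a crude "engagement" proxy
```

On this synthetic trace the alpha band dominates, so the beta/alpha ratio comes out well below 1; a real system would average such features over many channels and time windows before correlating them with states like stress or focus.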
