Researchers have long been working to put the brain in direct communication with machines. Recent demonstrations have seen animals and humans controlling ever more complex devices—advanced robotic arms and even prosthetic limbs that can provide tactile sensations. Here’s how a monkey, a rat, and a man were able to move matter with their minds.
At the University of Pittsburgh’s MotorLab, Andrew Schwartz and his colleagues have taught a monkey to control a robotic manipulator with 7 degrees of freedom. The monkey received two brain implants, one in the hand area and another in the arm area of its motor cortex. When the researchers place a knob at an arbitrary position in front of the animal, it maneuvers the robotic arm to grasp the knob and then turns it precisely by controlling the arm’s mechanical wrist. It’s probably the most complex machine a monkey has ever mastered with its thoughts alone.
Justin Sanchez and his colleagues at the University of Florida’s Neuroprosthetics Research Group, in Gainesville, are developing a new class of software decoders to translate brain activity into control signals for prosthetic devices. Although other brain-machine interfaces are static, these decoders adapt their parameters as the brain itself does when learning a new task. Using them, graduate student Babak Mahmoudi showed how a rat could learn to control a robotic gripper that bears no resemblance to its own limbs. Essentially, the rat used brain activity to perform an action unrelated to any movement it could make on its own, as if the decoder were an extension of the rat’s brain.
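The core idea of an adaptive decoder can be sketched in a few lines: a linear map from neural firing rates to a control signal whose weights are updated online from an error signal, so the decoder tracks the brain's changing encoding instead of staying fixed. The sketch below uses a simple least-mean-squares update on simulated firing rates; it is a toy illustration under those assumptions, not the Florida group's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 20
true_encoding = rng.normal(size=n_neurons)  # how the simulated neurons encode intent

weights = np.zeros(n_neurons)  # decoder starts untrained
learning_rate = 0.01

errors = []
for step in range(2000):
    intent = rng.normal()  # intended 1-D movement command
    # Firing rates: linear encoding of intent plus noise
    rates = intent * true_encoding + 0.1 * rng.normal(size=n_neurons)
    decoded = weights @ rates          # decoder's control output
    err = intent - decoded             # feedback error signal
    weights += learning_rate * err * rates  # LMS update: adapt the decoder online
    errors.append(err**2)

early = float(np.mean(errors[:100]))
late = float(np.mean(errors[-100:]))
print(f"mean squared error: early={early:.3f}, late={late:.4f}")
```

Because the weights keep adapting, the same update rule would also track a slowly drifting `true_encoding`, which is the property that matters when the brain itself is learning the task.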
A group of scientists at the Scuola Superiore Sant’Anna di Pisa and the Università Campus Bio-Medico di Roma, in Italy, have demonstrated how an amputee can control a robotic hand after having electrodes surgically implanted on two different nerves of his arm. The electrodes captured signals originating in the man’s brain, allowing him to wiggle the mechanical fingers and hold objects. What’s more, the nerve-machine interface was bidirectional: The robotic hand had touch sensors that could send electrical signals back to the nerves, and the man’s brain translated them into a touch felt on a hand lost years before.
This article originally appeared in print as "Mammalian Mind Over Matter."