Inside an MRI, a Non-Metallic Robot Performs Prostate Surgery

If the robot masters prostate procedures, brain surgery may be next

Inside an MRI scanner, a plastic and piezoelectric robot goes to work on a prostate.
Photo: Gregory Fischer/WPI

One of the holy grails of robotic surgery is the ability to perform minimally invasive procedures guided by real-time scans from a magnetic resonance imaging, or MRI, machine. The problem is that the space inside an MRI scanner is tight for a person, let alone a person and a robot. What’s more, these machines use very strong magnetic fields, so metal cannot safely be placed inside them, a restriction that poses an obvious problem for robots.

Now researchers at Worcester Polytechnic Institute (WPI) are developing an MRI-compatible robotic surgery tool that can overcome those limitations. Their system isn’t made of metal, but instead has plastic parts and ceramic piezoelectric motors that allow it to work safely inside an MRI.

The tool is now being tested on human patients undergoing prostate biopsies at Boston’s Brigham and Women’s Hospital. The radiologists can use real-time MRI images to guide the movement of their robotic assistant, which they believe will provide unprecedented accuracy. (If you’re wondering how it works, here’s a descriptive line from a journal article the researchers published last year: “The patient lies inside the MRI scanner . . . and the robot accesses the prostate through the perineal wall.”) 

The robot, developed by WPI in collaboration with Brigham and Johns Hopkins University, also boasts a low-noise control system that doesn’t cause electrical interference. “Essentially, we made a device that can move around the MRI bore without affecting image quality,” says Gregory Fischer, a professor of mechanical engineering at WPI whose Automation and Interventional Medicine Robotics Lab led the research.  

So far, a dozen men have participated in a clinical trial assessing the feasibility and safety of robot-assisted prostate biopsies. The count should reach 20 by the end of July.

The typical biopsy that doctors perform to check a man for prostate cancer is far from a precise procedure, says the trial’s principal investigator, Clare Tempany, a radiologist and director of the National Center for Image Guided Therapy at Brigham. Typically a physician targets the prostate, which is the size of “a small peach or plum,” she says, by placing a grid guide between the patient’s legs. Then the doctor inserts needles through the skin and into each quadrant to get tissue samples. “This is somewhat disparagingly called ‘the blind biopsy,’ ” says Tempany. “There’s no lesion targeted, it’s just: ‘Let’s push a bunch of needles in and see what we get.’ ”

These biopsies are sometimes aided by ultrasound imaging, which requires inserting an ultrasound wand into the patient’s rectum. Because ultrasound doesn’t provide clear enough images, doctors aren’t able to make precision strikes with their needles, says Tempany. They often end up taking 10 to 50 core samples, she says, and each needle stick carries the risk of infection.

This clinical trial demonstrates a better way. Looking at real-time MRI images, the doctor can identify parts of the prostate that look suspicious, and direct the robotic tool to those spots. “In our procedure, a small little robot places the needle on the skin’s surface and says, ‘This is the spot where you need to push, and in five or six centimeters you’ll hit your target,’ ” Tempany explains. In the current trial the doctor takes the actual step of inserting the needle, and typically takes samples in just four locations. “It’s a smarter biopsy,” she says.


Fischer says one of the challenges in building this bot was ensuring that its RF emissions didn’t interfere with the magnetic and electric fields used in MRI machines. His group first experimented with pneumatic actuators to move the device. “If you use air there are no electrical signals, so it seems like it should be great,” Fischer says. But he wasn’t satisfied with the results. “It’s very hard to use air pressure to control things precisely—you get jitter and overshoot,” he says.

His team knew they could choose among a number of commercially available piezoelectric motors, which convert an electrical signal into oscillations that drive the robot. The drivers for these off-the-shelf motors are typically designed to be low-power and low-cost, says lab member Hao Su, but unfortunately they’re not low-noise. To get around this problem, Su and his colleagues replaced the typical noisy drivers with a new piezoelectric drive system that uses a direct digital synthesizer, which generates precise waveforms to drive the actuators.
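The core idea behind a direct digital synthesizer is simple: a phase accumulator steps through a precomputed sine table at a rate set by the desired frequency, yielding a clean, precisely controlled waveform. The sketch below illustrates that principle only; it is not WPI’s driver code, and the frequencies chosen are arbitrary examples.

```python
import math

def dds_samples(freq_hz, sample_rate_hz, n_samples, table_size=1024):
    """Direct digital synthesis sketch: a phase accumulator advances by a
    fixed increment each sample and indexes into a sine lookup table, so
    the output frequency is set purely by the size of that increment."""
    table = [math.sin(2 * math.pi * i / table_size) for i in range(table_size)]
    phase = 0.0
    step = freq_hz * table_size / sample_rate_hz  # phase increment per sample
    out = []
    for _ in range(n_samples):
        out.append(table[int(phase) % table_size])
        phase += step
    return out

# e.g. a 50 kHz drive tone sampled at 1 MHz (illustrative values only)
wave = dds_samples(50_000, 1_000_000, 200)
```

Because the waveform is computed digitally rather than shaped by analog switching circuitry, it avoids the broadband electrical noise that would otherwise corrupt the MRI images.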

The team also packed all the control electronics into a carrying case that sits beside the MRI machine, communicating with the bot inside via a fiber-optic cable. “You can bring it into any MRI room and have it up and running in an hour,” Fischer says. 

In the current biopsy procedure, the doctor specifies the device’s location and also takes the crucial step of inserting the needles. With the doctor so clearly in control, you might ask: Does this tool really qualify as a robot? Fischer argues that the device does have some autonomy, as it senses where it is inside the scanner and positions its joints to move itself to the specified location.

And he adds that in lab tests they’ve proven that the system is capable of autonomously steering and inserting the needles itself based on its reading of MRI images. “It can locate the target, track the needle, and if it deflects during insertion, it can steer the needle to hit the target,” he says. However, that advanced system isn’t yet ready for clinical testing. “We’re taking baby steps to get the robot into clinical use,” he says.
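Fischer doesn’t describe the control law, but the flavor of image-guided steering can be sketched as a simple feedback correction: compare where the images show the needle tip against where it should be, and command a proportional adjustment. Every name, axis, and gain below is hypothetical.

```python
def steering_correction(target_mm, observed_tip_mm, gain=0.5):
    """One-axis proportional correction: command a lateral adjustment
    proportional to the observed deflection error. The function name,
    single axis, and gain are illustrative, not WPI's actual control law."""
    error = target_mm - observed_tip_mm
    return gain * error

# Tip observed deflected 2 mm off the planned path: nudge it partway back,
# then re-image and repeat on the next control cycle.
adjustment = steering_correction(0.0, -2.0)
```

In practice such a loop would run repeatedly during insertion, with each new MRI frame supplying an updated tip position.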


Fischer and his research partners are working with Acoustic Medsystems to commercialize the device for both prostate cancer diagnostics and treatment. That company specializes in tools used in a prostate cancer treatment called brachytherapy, in which tiny radioactive sources are inserted into the tumor. And Fischer says his lab is investigating many other medical applications for their robots, including procedures involving the liver, kidneys, and even the brain.

Last year, the lab received a $3.8 million grant from the NIH to work on an MRI-compatible neurosurgery tool that could be used to burn away brain tumors or place electrodes for deep brain stimulation. The approach could help surgeons reach precisely the right target by letting them use MRI for guidance. Su says that’s a critical need: “When surgeons open up the skull, the brain can expand 2 centimeters.” The robot could also allow surgeons to perform interventions that are less invasive, particularly in brain areas that are difficult to reach.

A few other research groups are working on MRI-compatible robotic tools—including, surprisingly, the space tech company MDA Corporation, which has built robotic arms for the International Space Station and various Mars rovers. Tim Fielding, an MDA senior systems engineer who directs work on the company’s new medical tools, says space and medical robotics have one critical thing in common. “Both applications require high reliability and safety,” he says. “Whether you’re operating near a space station or a human body, there’s a high-value object to protect.”

The MDA team has developed an MRI-compatible robotic tool that is currently being tested in a clinical study of breast biopsies, and prostate biopsies may soon follow. Fielding says this tool, which the surgeon uses to both position and insert the needles based on real-time MRI images, is “automatic as opposed to autonomous.” The tool doesn’t make decisions, he says, but is simply carrying out the doctor’s instructions. 

“Right now there’s a human in the loop,” says Fielding. “I envision a time when MRI could be used to close that loop and improve positioning, but we want to be careful how much we delegate.” 

