One of the holy grails of robotic surgery is the ability to perform minimally invasive procedures guided by real-time scans from a magnetic resonance imaging, or MRI, machine. The problem is that the space inside an MRI scanner is tight for a person, let alone a person and a robot. What's more, these machines use very strong magnetic fields, so metal objects cannot safely go inside them, a restriction that poses an obvious problem for robots.
Now researchers at Worcester Polytechnic Institute (WPI) are developing an MRI-compatible robotic surgery tool that can overcome those limitations. Their system isn't made of metal, but instead has plastic parts and ceramic piezoelectric motors that allow it to work safely inside an MRI.
The tool is now being tested on human patients undergoing prostate biopsies at Boston’s Brigham and Women’s Hospital. The radiologists can use real-time MRI images to guide the movement of their robotic assistant, which they believe will provide unprecedented accuracy. (If you’re wondering how it works, here’s a descriptive line from a journal article the researchers published last year: “The patient lies inside the MRI scanner . . . and the robot accesses the prostate through the perineal wall.”)
The robot, developed by WPI in collaboration with Brigham and Johns Hopkins University, also boasts a low-noise control system that doesn’t cause electrical interference. “Essentially, we made a device that can move around the MRI bore without affecting image quality,” says Gregory Fischer, a professor of mechanical engineering at WPI whose Automation and Interventional Medicine Robotics Lab led the research.
So far, a dozen men have participated in a clinical trial assessing the feasibility and safety of robot-assisted prostate biopsies. The count should reach 20 by the end of July.
The typical biopsy that doctors perform to check a man for prostate cancer is far from a precise procedure, says the trial's principal investigator Clare Tempany, a radiologist and director of the National Center for Image Guided Therapy at Brigham. Typically a physician targets the prostate, which is the size of “a small peach or plum,” she says, by placing a grid guide between the patient's legs. Then the doctor inserts needles through the skin and into each quadrant to get tissue samples. “This is somewhat disparagingly called ‘the blind biopsy,’ ” says Tempany. “There’s no lesion targeted, it’s just: ‘Let’s push a bunch of needles in and see what we get.’ ”
These biopsies are sometimes aided by ultrasound imaging, which requires inserting an ultrasound wand into the patient’s rectum. Because ultrasound doesn’t provide clear enough images, doctors aren’t able to make precision strikes with their needles, says Tempany. They often end up taking 10 to 50 core samples, she says, and each needle stick carries the risk of infection.
This clinical trial demonstrates a better way. Looking at real-time MRI images, the doctor can identify parts of the prostate that look suspicious, and direct the robotic tool to those spots. “In our procedure, a small little robot places the needle on the skin’s surface and says, ‘This is the spot where you need to push, and in five or six centimeters you’ll hit your target,’ ” Tempany explains. In the current trial the doctor takes the actual step of inserting the needle, and typically takes samples in just four locations. “It’s a smarter biopsy,” she says.
Fischer says one of the challenges in building this bot was ensuring that its RF emissions didn’t interfere with the magnetic and electric fields used in MRI machines. His group first experimented with pneumatic actuators to move the device. “If you use air there are no electrical signals, so it seems like it should be great,” Fischer says. But he wasn’t satisfied with the results. “It’s very hard to use air pressure to control things precisely—you get jitter and overshoot,” he says.
His team knew they could choose between a number of commercially available piezoelectric motors, which convert an electrical signal into oscillations that drive the robot. The drivers for these off-the-shelf motors are typically designed to be low-power and low-cost, says lab member Hao Su, but unfortunately they’re not low-noise. To get around this problem, Su and his colleagues replaced the typical noisy drivers with a new piezoelectric drive system that uses a direct digital synthesizer, which generates precise waveforms to drive the actuators.
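The direct digital synthesis technique behind the new drive system is straightforward to sketch: a phase accumulator advances by a fixed "tuning word" each clock tick, and the accumulated phase indexes into a sine function to produce a spectrally clean waveform. The sketch below is a minimal illustration of the general DDS principle, not the researchers' actual driver; the parameter names and bit widths are assumptions.

```python
import math

def dds_samples(freq_hz, sample_rate_hz, n_samples, phase_bits=32):
    """Direct digital synthesis sketch: a phase accumulator steps by a
    fixed tuning word each sample, and the phase maps to a sine value.
    (Illustrative only; parameters are not from the WPI system.)"""
    # Tuning word sets the output frequency as a fraction of the sample rate.
    tuning_word = int(freq_hz * (1 << phase_bits) / sample_rate_hz)
    phase = 0
    out = []
    for _ in range(n_samples):
        angle = 2 * math.pi * phase / (1 << phase_bits)
        out.append(math.sin(angle))
        # Wrap the accumulator at the phase-register width.
        phase = (phase + tuning_word) & ((1 << phase_bits) - 1)
    return out
```

Because the frequency is set purely by the tuning word, the output can be retuned digitally with fine resolution and without the broadband switching noise of a cheap driver, which is the property that matters inside an MRI bore.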
The team also packed all the control electronics into a carrying case that sits beside the MRI machine, communicating with the bot inside via a fiber-optic cable. “You can bring it into any MRI room and have it up and running in an hour,” Fischer says.
In the current biopsy procedure, the doctor specifies the device’s location and also takes the crucial step of inserting the needles. With the doctor so clearly in control, you might ask: Does this tool really qualify as a robot? Fischer argues that the device does have some autonomy, as it senses where it is inside the scanner and positions its joints to move itself to the specified location.
And he adds that in lab tests they’ve proven that the system is capable of autonomously steering and inserting the needles itself based on its reading of MRI images. “It can locate the target, track the needle, and if it deflects during insertion, it can steer the needle to hit the target,” he says. However, that advanced system isn’t yet ready for clinical testing. “We’re taking baby steps to get the robot into clinical use,” he says.
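The closed-loop steering Fischer describes can be pictured as a simple feedback cycle: image the needle tip, measure its deflection from the target, command a proportional correction, repeat. The toy sketch below illustrates that loop only; the callback names, gain, and tolerance are hypothetical stand-ins for MRI tracking and the robot's actuators, not the lab's actual controller.

```python
def steer_needle(target_mm, read_tip_position, apply_steering,
                 max_updates=20, gain=0.5, tolerance_mm=0.1):
    """Toy image-guided steering loop (illustrative, not the WPI system).
    read_tip_position: hypothetical callback returning the tip's lateral
        position from imaging, in mm.
    apply_steering: hypothetical callback commanding a lateral correction.
    """
    for _ in range(max_updates):
        error = target_mm - read_tip_position()
        if abs(error) < tolerance_mm:
            break  # tip is on target within tolerance
        # Proportional correction: steer back toward the target.
        apply_steering(gain * error)
```

With a proportional gain below 1, each imaging update shrinks the residual error, so a needle that deflects mid-insertion is steered back onto the target over successive scans.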
Fischer and his research partners are working with Acoustic Medsystems to commercialize the device for both prostate cancer diagnostics and treatment. That company specializes in tools used in a prostate cancer treatment called brachytherapy, in which tiny radioactive sources are inserted into the tumor. And Fischer says his lab is investigating many other medical applications for their robots, including procedures involving the liver, kidneys, and even the brain.
Last year, the lab received a $3.8 million grant from the NIH to work on an MRI-compatible neurosurgery tool that could be used to burn away brain tumors or place electrodes for deep brain stimulation. The approach could help surgeons reach precisely the right target by letting them use MRI for guidance. Su says that’s a critical need: “When surgeons open up the skull, the brain can expand 2 centimeters.” The robot could also allow surgeons to perform interventions that are less invasive, particularly in brain areas that are difficult to reach.
A few other research groups are working on MRI-compatible robotic tools—including, surprisingly, the space tech company MDA Corporation, which has built robotic arms for the International Space Station and various Mars rovers. Tim Fielding, an MDA senior systems engineer who directs work on the company’s new medical tools, says space and medical robotics have one critical thing in common. “Both applications require high reliability and safety,” he says. “Whether you’re operating near a space station or a human body, there’s a high-value object to protect.”
The MDA team has developed an MRI-compatible robotic tool that is currently being tested in a clinical study of breast biopsies, and prostate biopsies may soon follow. Fielding says this tool, which the surgeon uses to both position and insert the needles based on real-time MRI images, is “automatic as opposed to autonomous.” The tool doesn’t make decisions, he says, but is simply carrying out the doctor’s instructions.
“Right now there’s a human in the loop,” says Fielding. “I envision a time when MRI could be used to close that loop and improve positioning, but we want to be careful how much we delegate.”
Senior Editor Eliza Strickland joined IEEE Spectrum in March 2011 and was initially assigned the Asia beat. She got down to business several days later when the Fukushima Daiichi nuclear disaster began. Strickland shared a Neal Award for news coverage of that catastrophe and wrote the definitive account of the accident's first 24 hours. She next moved to the biomedical engineering beat and managed Spectrum's 2015 special report, “Hacking the Human OS.” That report spawned the Human OS blog about emerging technologies that are enabling a more precise and personalized kind of medicine. The blog reports on wearable sensors, big-data analytics, and neural implants that may turn us all into cyborgs. Over the years, Strickland watched as artificial intelligence (AI) technology made inroads into the biomedical space, reporting on crossovers between AI and neuroscience research and IBM Watson's ill-fated efforts in AI health care. These days she oversees Spectrum's coverage of all things AI. Strickland has reported on science and technology for nearly 20 years, writing for such publications as Discover, Nautilus, Sierra, Foreign Policy, and Wired. She holds a master's degree in journalism from Columbia University.