Robots for Real: Surgeons and Robots Scrub Up

At Johns Hopkins University, doctors and engineers collaborate to create the next generation of robots for the operating room

This segment is part of "Engineers of the New Millennium: Robots for Real."

In this special report, we meet some of the world’s most creative minds in robotics to find out how their robots will transform our lives—for real. “Engineers of the New Millennium: Robots for Real,” a coproduction of IEEE Spectrum magazine and the National Science Foundation’s Directorate for Engineering, aired on public radio stations across the United States.

Hosted by Susan Hassler and Ken Goldberg
Senior editor: Erico Guizzo

TRANSCRIPT:

Reported by Laurie Howell

Jim Handa: We should give them a gold star...oh, this is just, this is just awesome...yeah...what's it doing? It's giving a force.

Laurie Howell: Jim Handa is an eye surgeon who gets to see into the future. He collaborates with the engineering research center at Johns Hopkins to help test out new medical robots. The Steady-Hand Eye Robot looks sort of like a metal pencil attached to a large microscope. It hovers over a model eye, which can also be seen in 3-D on the attached computer monitor. Today Handa is testing out the latest upgrade. Now, when he uses the surgical tool, sound cues, as well as visuals, will help guide him.

Jim Handa: Three, three, two, two, three, four...oh, that's really useful, actually...okay...three, two, one, one...because if you knew exactly the force that's being generated and if you knew the force that would induce a hemorrhage or a tear to the tissue, then you have instantaneous information. Right.

Laurie Howell: Handa's feedback is exactly what the robotics team needs to perfect the Steady-Hand Eye Robot, which addresses major challenges in retinal microsurgery. The robot cancels out human hand tremor. It is part of a microsurgical workstation that combines robotics, sensing, and imaging to help the surgeon work on tissue that is so small that the surgical manipulations can't even be felt by the human hand.

Jim Handa: I think there are some elements that make us excited about it because of the ability to use robotics and other instruments to go to different structures within the eye that we are currently unable to adequately operate on.

Laurie Howell: Handa hopes to use the Eye Robot in human clinical trials within a few years. It could open up a whole new branch of eye surgery, which is part of the mission, according to center director Russ Taylor, who helped design one of the first surgical robots nearly 20 years ago.

Russ Taylor: What we're trying to do is to understand a real problem that a surgeon has and then find a way to solve that problem, to make a surgical intervention more accurate, more precise, safer, less invasive, or even just the ability to do it at all.

Laurie Howell: The Engineering Research Center for Computer-Integrated Surgical Systems and Technology was launched in 1997. The long title reflects a big agenda: create a paradigm shift in the field of medicine. So, from the beginning, the team wanted to design whole systems. The center's deputy director, Greg Hager, is in charge of the Eye Robot's ability to acquire, process, and display information in real time.

Greg Hager: We think here that the real impact is when you put it together—when you provide sensing, visualization, information processing, and devices and tools that are all integrated into what we call a surgical workstation. That's when you have the true impact. It's about making the eyes better, it's about making the hands better, making the eyes plus the hands better to really enhance surgery.

Laurie Howell: One of the center's first inventions, the MR robot, is now in human clinical trials at the National Institutes of Health, diagnosing prostate cancer. Electrical engineer and computer scientist Gabor Fichtinger helped develop the project.

Gabor Fichtinger: We literally conceptualized this device on a napkin at a reception, so it's a proverbial napkin, but it happens to be true. We put it down on a piece of paper not bigger than my palm, and in 22 months we did our first patient.

Laurie Howell: Humble beginnings for a robot that manages to place a needle immediately and precisely when the MRI scan reveals possible cancer. One of the robot's designers, mechanical engineer Louis Whitcomb, says it's a powerful combination because the MRI shows human tissue in unprecedented detail.

Louis Whitcomb: Our project, which is what we call the MR robot project, was to develop a robot that was specifically designed to do needle placement in the prostate under MR guidance. So, take the technology of robotics—robotics are very precise and they're like a machine tool, and if you can tell them exactly where to place the needle, they'll place it with millimeter accuracy.

Gabor Fichtinger: This could be a perfect implementation of the so-called one-stop-shopping paradigm when you're going for imaging and immediately you get your biopsy. You don't have to come back for a second session, and you don't have to spend another 2 hours in the MRI scanner. We just do it right on the spot.

Laurie Howell: The next 20 years in medical robotics will have as much to do with information as with robots. As technology improves treatment capabilities and computers track those experiences, doctors can use that information to improve patient care, according to the center's director, Russ Taylor.

Russ Taylor: Engineers and engineering faculty and students love to solve real problems that can make a difference in the world, and I think that's really motivated us.

Laurie Howell: I'm Laurie Howell.
