New Da Vinci Xi Surgical Robot Is Optimized for Complex Procedures

The robot surgeon gets even better at rearranging your insides


Intuitive Surgical's da Vinci series of surgical robots has been conducting FDA-approved minimally invasive surgery on humans for well over a decade now, and the company is continually trying to make its robots better at performing operations with the absolute minimum of cutting-you-open-ness. Last week, Intuitive released a fancy new version of the da Vinci robot, the Xi, which it says has more capabilities than previous models and is optimized for complex procedures.

A quick reminder if you're not familiar with the da Vinci: it's not an automated surgical system. Rather, it's remote controlled by a human surgeon. The word "remote" might be a bit misleading, though: while the system can be operated from just about anywhere, the surgeon most commonly sits right next to it, in the operating room with the patient. The benefit of the da Vinci system isn't really the potential for remote access; it's that the surgeon can work through tiny robotic tools that require a much smaller incision, along with visual enhancements like infrared imaging that can show things the surgeon's eyes alone might miss.
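To make the teleoperation idea a bit more concrete, here's a minimal, purely illustrative sketch of how a console hand motion might be mapped to an instrument motion, with motion scaling and simple tremor smoothing. The constants, function names, and filter here are assumptions for illustration, not Intuitive's actual control software.

```python
import numpy as np

# Illustrative only: a toy mapping from the surgeon's hand motion at the
# console to the motion of an instrument tip. The scaling factor, smoothing
# constant, and function names are assumptions, not da Vinci internals.

MOTION_SCALE = 0.2   # 5:1 scaling: 10 mm of hand motion becomes 2 mm at the tool
SMOOTHING = 0.3      # weight for a simple low-pass filter standing in for tremor filtering

def instrument_command(hand_delta_mm, prev_command_mm):
    """Map a console hand displacement (mm, xyz) to a filtered instrument displacement."""
    scaled = np.asarray(hand_delta_mm, dtype=float) * MOTION_SCALE
    prev = np.asarray(prev_command_mm, dtype=float)
    # Exponential smoothing: blend the new scaled motion with the previous command.
    return SMOOTHING * scaled + (1.0 - SMOOTHING) * prev

# A 10 mm hand movement along x is scaled to 2 mm, then smoothed toward the
# previous command (here zero), giving a small, steady tool motion.
print(instrument_command([10.0, 0.0, 0.0], [0.0, 0.0, 0.0]))
```

Motion scaling and tremor reduction of this general sort are part of why a teleoperated tool can move more steadily and precisely than a hand working directly on the patient.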

In case you didn't catch all that, here's the rundown on what's new and improved in the Xi:

  • A new overhead instrument arm architecture designed to facilitate anatomical access from virtually any position. 
  • A new endoscope digital architecture that creates a simpler, more compact design with improved visual definition and clarity. 
  • An ability to attach the endoscope to any arm, providing flexibility for visualizing the surgical site. 
  • Smaller, thinner arms with newly designed joints that offer a greater range of motion than ever before. 
  • Longer instrument shafts designed to give surgeons greater operative reach. 

In addition, the da Vinci Xi System is built to be compatible with Intuitive Surgical's Firefly Fluorescence Imaging System. Firefly isn't yet available on the Xi (Intuitive says it plans to seek regulatory clearance for it), but it's currently offered as an option on the da Vinci Si, where it gives the surgeon additional visual information in a variety of surgical procedures by enabling real-time visualization and assessment of vessels, bile ducts, and tissue perfusion.

That last bit is a good example of one of the benefits of using a robot for surgery: you can add a bit of augmented reality into the mix, overlaying information directly on the important bits of the patient, which makes the surgeon's job significantly easier than glancing back and forth at separate images while trying not to poke the wrong squishy thing.
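Conceptually, that kind of overlay boils down to blending a fluorescence channel onto the visible-light endoscope image. Here's a toy sketch of the idea; it is not the Firefly system itself, and the array names, threshold, and blend weight are assumptions for illustration.

```python
import numpy as np

# Toy sketch of the augmented-reality idea described above -- not Firefly.
# A near-infrared fluorescence channel is thresholded, tinted green, and
# blended onto the visible-light endoscope frame so that strongly fluorescing
# tissue "lights up" in the surgeon's view.

def overlay_fluorescence(rgb_frame, nir_frame, threshold=0.4, alpha=0.6):
    """Blend a normalized NIR image (H x W, 0..1) onto an RGB frame (H x W x 3, 0..1)."""
    mask = nir_frame > threshold                  # pixels with a strong fluorescence signal
    highlight = np.zeros_like(rgb_frame)
    highlight[..., 1] = nir_frame                 # put the signal in the green channel
    out = rgb_frame.copy()
    out[mask] = (1 - alpha) * rgb_frame[mask] + alpha * highlight[mask]
    return out

# Example with random data standing in for real endoscope and NIR frames.
rgb = np.random.rand(480, 640, 3)
nir = np.random.rand(480, 640)
composite = overlay_fluorescence(rgb, nir)
```

In a real system the fluorescence and visible-light images also have to be registered so the overlay lines up with the anatomy in real time, which is part of what makes an integrated system attractive.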

Still, we should point out that the benefits of robot-assisted surgery are still open to debate: the procedures are minimally invasive, but the systems are expensive, which makes the surgeries more expensive, and the overall benefit to the patient isn't always clear.

However, a UCLA study from February of this year did find that "prostate cancer patients who undergo robotic-assisted prostate surgery have fewer instances of cancer cells at the edge of their surgical specimen and less need for additional cancer treatments like hormone or radiation therapy than patients who have traditional 'open' surgery."

This is the sort of study that we need more of, especially as surgical robots get more advanced: the technology that da Vinci systems represent is very impressive, but it's important to know that it's definitely worth using before our healthcare system invests in it.

[ da Vinci Xi ]


How the U.S. Army Is Turning Robots Into Team Players

Engineers battle the limits of deep learning for battlefield bots


RoMan, the Army Research Laboratory's robotic manipulator, considers the best way to grasp and move a tree branch at the Adelphi Laboratory Center, in Maryland.

Evan Ackerman

This article is part of our special report on AI, “The Great AI Reckoning.”

"I should probably not be standing this close," I think to myself, as the robot slowly approaches a large tree branch on the floor in front of me. It's not the size of the branch that makes me nervous—it's that the robot is operating autonomously, and that while I know what it's supposed to do, I'm not entirely sure what it will do. If everything works the way the roboticists at the U.S. Army Research Laboratory (ARL) in Adelphi, Md., expect, the robot will identify the branch, grasp it, and drag it out of the way. These folks know what they're doing, but I've spent enough time around robots that I take a small step backwards anyway.

The robot, named RoMan, for Robotic Manipulator, is about the size of a large lawn mower, with a tracked base that helps it handle most kinds of terrain. At the front, it has a squat torso equipped with cameras and depth sensors, as well as a pair of arms that were harvested from a prototype disaster-response robot originally developed at NASA's Jet Propulsion Laboratory for a DARPA robotics competition. RoMan's job today is roadway clearing, a multistep task that ARL wants the robot to complete as autonomously as possible. Instead of instructing the robot to grasp specific objects in specific ways and move them to specific places, the operators tell RoMan to "go clear a path." It's then up to the robot to make all the decisions necessary to achieve that objective.
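To make "go clear a path" a little more concrete, here's a purely illustrative sketch of how a single high-level command might decompose into the robot's own perception and manipulation decisions. Every class and function below is a hypothetical stand-in; ARL's actual autonomy stack is far more sophisticated.

```python
# Purely illustrative: the operator issues one high-level command and the
# robot decomposes it into its own perception and manipulation steps.

class DummyRobot:
    """Stand-in robot that 'finds' a fixed list of obstacles, then reports a clear path."""
    def __init__(self, obstacles):
        self.obstacles = list(obstacles)

    def detect_obstacle(self):
        # In reality: cameras and depth sensors looking for a branch in the roadway.
        return self.obstacles[0] if self.obstacles else None

    def plan_grasp(self, obstacle):
        return f"grasp plan for {obstacle}"

    def execute_grasp(self, grasp):
        print("executing:", grasp)

    def drag_to_clearance(self, obstacle):
        print("dragging", obstacle, "off the roadway")
        self.obstacles.remove(obstacle)

def clear_path(robot):
    """Keep removing obstacles until the robot perceives a clear route."""
    while True:
        obstacle = robot.detect_obstacle()
        if obstacle is None:
            return "path clear"
        robot.execute_grasp(robot.plan_grasp(obstacle))
        robot.drag_to_clearance(obstacle)

print(clear_path(DummyRobot(["tree branch"])))
```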
