Robot Outperforms a Surgeon in a Precision Training Task

The feat brings us one step closer to fully automated surgeries


Minho Hwang and colleagues used the da Vinci Research Kit robot, trained using AI, to complete the automated peg-transfer task. The blocks, pegs, and pegboard are meant to simulate a surgical setting.

Ken Goldberg

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Who is better at performing surgery: an experienced surgeon or a robot?

Surgeons typically must make relatively large incisions during surgery, whereas a robot's small instruments can fit through much smaller ones. Given this advantage, it's now quite common for surgeons to perform surgery with remote-controlled robotic arms—combining the precision of an experienced human with the minimal invasiveness of a small robotic instrument. In these cases, however, the surgeon is still controlling the robot; a fully automated robotic system that can outperform surgeons in precision has yet to be realized.

A recent advance suggests that robots could surpass human performance in the near future, however. In a paper published 10 May in IEEE Transactions on Automation Science and Engineering, a multinational team of researchers reported a study in which a robot completed a common surgery training task as accurately as an experienced surgeon, and did so more quickly and more consistently.

Minho Hwang, an assistant professor at the Daegu Gyeongbuk Institute of Science and Technology, in South Korea, was involved in the study. He notes that many robotic systems currently rely on automated control of cables, which are subjected to friction, cable coupling, and stretch—all of which can make precision positioning difficult.

“When humans control the robots, they can compensate through human visual feedback,” explains Hwang. “But automation of robot-assisted surgery is very difficult due to [these] position errors.”

In their study, Hwang and colleagues took a standard da Vinci robotic system, a model commonly used for robot-assisted surgery, and strategically placed 3D-printed markers on its robotic arm. This allowed the team to track the arm's movements using a color and depth sensor. They then trained a machine-learning model on the arm's movements to compensate for its positioning errors. Results suggest that the trained model can reduce the mean tracking error by 78 percent, from 2.96 millimeters to 0.65 mm.
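The paper's actual model and inputs aren't detailed here, but the general idea of learned error compensation can be sketched: fit a model that maps commanded positions to observed positioning error, then subtract the predicted error from each new command. Everything below—the linear error model and the synthetic data—is purely illustrative, not the authors' method.

```python
import numpy as np

# Illustrative sketch of learned error compensation (NOT the paper's model).
# We pretend cable effects cause a systematic offset, fit a linear model to
# the observed error, and correct future commands by subtracting the
# predicted error.
rng = np.random.default_rng(0)

commanded = rng.uniform(-1.0, 1.0, size=(500, 3))   # commanded tip positions (arbitrary units)
true_error = 0.05 * commanded + 0.02                # synthetic systematic offset
observed = commanded + true_error + rng.normal(0, 0.001, commanded.shape)

# Fit a linear correction: error ~ A @ commanded + b, via least squares.
X = np.hstack([commanded, np.ones((len(commanded), 1))])
coef, *_ = np.linalg.lstsq(X, observed - commanded, rcond=None)

# Apply the correction to a new command.
new_cmd = np.array([0.3, -0.2, 0.5])
predicted_err = np.hstack([new_cmd, 1.0]) @ coef
corrected_cmd = new_cmd - predicted_err
```

In this toy setup, sending `corrected_cmd` instead of `new_cmd` cancels most of the systematic offset, which is the same intuition behind reducing the robot's mean tracking error.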

Next, the researchers put their system to the test against an experienced surgeon who had performed more than 900 surgeries, as well as against nine volunteers with no surgical experience. The study participants were asked to complete a peg transfer task, which is a standardized test for training surgeons that involves transferring six triangular blocks from one side of a pegboard to the other and then back again. While the task sounds simple enough, it requires millimeter precision.

The study participants completed three variations of the peg-transfer task using the da Vinci system: unilateral (using one arm to move a block), bilateral (using two arms to move two blocks simultaneously), and bilateral with a handover (using one arm to pick up a block, transfer it to the other arm, and place it on the board). Their robot-assisted performance was compared with that of the fully automated robotic system designed by Hwang's team.

Hwang and colleagues show how their robotic system can outperform a surgeon in a precision training task. In the most difficult variation of the task, the bilateral handover, the robot must pick up a block, transfer it to the other robotic arm, and then place the block back on the pegboard, all with millimeter precision. The robot outperforms the surgeon by 31.7 percent in mean transfer time.

Using one arm, the surgeon outperformed the automated robot in terms of speed. But in the more complex tasks involving two arms, the robot outperformed the surgeon.

For example, for the most difficult task (bilateral handovers), the surgeon achieved a success rate of 100 percent with a mean transfer time of 7.9 seconds. The robot had the same success rate, but with a mean transfer time of just 6.0 seconds.

“We were very surprised by the robot’s speed and accuracy, as it is very hard to surpass the skill of a trained human surgeon,” says Ken Goldberg, a professor in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley, who was also involved in the study. “We were also surprised by how consistent the robot was; it transferred 120 blocks flawlessly without a single failure.”

Goldberg and Hwang note that this is a preliminary study in a controlled environment, and more studies are still needed to achieve fully automated robotic surgery. But as far as they are aware, this is the first instance of a robot outperforming a human in a surgery-related training task.

“We have demonstrated that fast and accurate automation is feasible for one surgical task involving rigid objects of known shape. The next step is to demonstrate this for other tasks and in the much more complex environment of a human body,” says Hwang.

He says that, in future work, the team plans to extend their approach to automating surgical subtasks such as tissue suturing, and that they would like to build upon their methods for calibration, motion planning, visual servoing, and error recovery.

This article appears in the October 2022 print issue as “Robot Bests Surgeon in Precision Task.”
