Telepresence Robots Are Helping Take Pressure Off Hospital Staff

Ava Robotics’ autonomous telepresence robots are helping doctors see more COVID-19 patients while avoiding infection

Ava telepresence robot at a lab
Photo: Ava Robotics

Robots are hard at work in hospitals doing what they can to help us get through the COVID-19 pandemic. In addition to sterilizing rooms and delivering medication and supplies, robots are helping hospital staff work more safely and efficiently through telepresence.

iRobot spinoff Ava Robotics has telepresence robots deployed in hospitals, where they’re enabling doctors quarantined at home to stay present in emergency rooms and allowing nurses to see patients without having to worry about personal protective equipment. The robots are lowering the risk of infection while making it possible for hospital staff to stay productive even if they’re at home or under quarantine themselves, adding a little extra slack to an overstressed system.

Ava Robotics was founded in 2016 by folks from iRobot, and the concept for the robot itself goes back at least a decade. iRobot announced a healthcare robotics division in 2009, introduced a demonstrator platform called Ava at CES in 2011, and then used the technology in Ava (fully autonomous navigation combined with robust, high-quality telepresence) to develop both a telemedicine system called RP-VITA (in partnership with InTouch Health) and a more traditional telepresence robot called the Ava 500 (in partnership with Cisco). With iRobot itself focusing primarily on consumer rather than commercial or enterprise robots over the past few years, Ava Robotics took that Ava technology and built a separate business around it.

While Ava is a capable general telepresence robot (and plenty of them are deployed in enterprise settings), several features make it uniquely qualified for work in hospitals, reflecting its history as a healthcare platform. First, the robot is fully autonomous, using depth cameras and lidar to navigate by itself while avoiding static and dynamic obstacles. Second, Ava relies on a HIPAA-compliant communications system from Cisco to keep conversations between doctors and patients private. And third, the entire robot is easy to clean and disinfect, and can be sterilized just like any other surface in a hospital.

Since hospital staff are, understandably, far too busy to tell us how they feel about their robot colleagues right now, we spoke with Youssef Saleh, founder and CEO at Ava Robotics, to learn more about how hospitals are using Ava’s robots. While we’ve been requested to keep the deployment sites anonymous, it’s important to understand that we’re not talking about a demo or a pilot project here—these robots are out there, in hospitals, being as useful as they can.

IEEE Spectrum: How are your robots being used in hospitals right now?

Youssef Saleh: The core of what we do is the ability to “teleport” people to places without them physically being there. During COVID-19, our robots are being used today in a variety of applications in the healthcare system, from hospitals to elder care, and we’ve been getting direct feedback from the doctors who are using Ava. I’ll try to be as transparent as possible here in terms of what they’re doing and what they’re thinking of—what other applications they’re considering.

In one of the hospitals, they moved the robot into one of the ER rooms to be able to bring doctors in to participate in diagnosis and treatment from wherever they are. That could be either because some of the key doctors are working from different locations, or because it's after their shift. And we've seen scenarios where doctors are quarantined because they've had direct contact with a COVID-19 patient or colleague, so even though they're healthy, they're at home; with Ava, they can still do some of what they need to do.


Another application, one we hadn't thought of, was using Ava to help with the shortage of personal protective equipment. Every time a doctor or a nurse needs to walk into a room with a COVID-19 patient, they have to put on all this protective gear. It might be just going in for a minute or two, and then they come out and have to dispose of it, and then go through that cycle again and again. So using Ava to minimize the number of times you have to go in and out of a room to only when you absolutely need to has been a powerful capability.

One other application, which again there's no way we could have thought of, was shared with us by a doctor: before going into a room with a COVID-19 patient, they can use the robot to assess the patient while they're in a different room putting on their protective equipment, which saves time.

Doctors have talked to us about doing testing and triage using the robot, but I don’t know how many of those use cases are being used operationally, while the robots in the ER and the robots helping nurses avoid going in and out of rooms are being used every day. Hospitals are also considering enabling loved ones to remotely visit patients using the robot.

Last but not least, hospitals are using the robots to deal with the shortages of staff or personnel, since now you could bring them in from anywhere, even if they’re quarantined themselves.

How important is Ava’s ability to navigate autonomously?

It’s huge. These people are incredibly busy. No one wants to be driving a robot. They’re jumping from one thing to another. They want the robot in a specific room, and they can do something else until it gets there, and then they can go in and have a dialogue or what have you. And when they’re done, they’re not going to have to deal with driving this robot back to wherever they need it to go. That’s our teleporting concept—you need to be someplace, you magically show up there, and when you’re done you disappear and the robot takes care of itself. And in addition, there’s the safety aspect. The robot isn’t going to bump into things, it’s not going to fall down the stairs.

Are there things that hospitals want the robot to do that it can’t do right now, or new capabilities and use cases that you’re working towards?

There are certain applications that they’re doing, but they could do them better or more effectively if the robot could do more things. For example, they wanted to do the triage and the initial testing, so there would be another very strong application if the robot could do the swab.

We’re also working on the coordination of multiple robots based on feedback we’ve gotten; there is a lot of other equipment in an emergency room, and if you have five doctors going in there with five different robots, it could get crowded. Being able to work in environments with a lot of noise can be a challenge, being able to focus on only the audio that you want. None of these have been hindrances, and we’ve been getting incredibly positive feedback considering how quickly we’ve been deploying. But in terms of enhancing the experience, this is the kind of feedback we’ve been hearing from doctors.

Once things have stabilized, we’ll be following up directly with hospital staff to get their perspective on how telepresence robots like Ava made a difference to their work.

[ Ava Robotics ]

