Georgia Tech Robot Masters the Art of Opening Doors and Drawers

Georgia Tech researchers have programmed a robot to autonomously approach and open doors, drawers, and cabinets


Erico Guizzo is IEEE Spectrum's Digital Innovation Director.


To be useful in human environments, robots must be able to do things that people do on a daily basis -- things like opening doors, drawers, and cabinets. We perform those actions effortlessly, but getting a robot to do the same is another story. Now Georgia Tech researchers have come up with a promising approach.

Professor Charlie Kemp and Advait Jain at Georgia Tech's Healthcare Robotics Laboratory have programmed a robot to autonomously approach and open doors and drawers. It does that using omni-directional wheels and compliant arms, and the only information it needs is the location and orientation of the handles.

The researchers discussed their results yesterday at the IEEE International Conference on Robotics and Automation, in Anchorage, Alaska, where they presented a paper, "Pulling Open Doors and Drawers: Coordinating an Omni-Directional Base and a Compliant Arm with Equilibrium Point Control."

One of the neat things about their method is that the robot is not stationary while opening the door or drawer. "While pulling on the handle," they write in their paper, "the robot haptically infers the mechanism's kinematics in order to adapt the motion of its base and arm."
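
To give a sense of what that haptic inference might involve, here is a minimal, hypothetical Python sketch (the team wrote their software in Python, but this is not their code): as the robot pulls, it records the handle's position and fits both a straight line and a circle to the path it has traced; a tight circular fit with a modest radius suggests a hinged door, while a straight path suggests a drawer. The function names and the 2-meter radius cutoff are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of inferring whether the robot is
# pulling on a drawer (prismatic joint) or a door (revolute joint) from the
# planar path traced by the handle while pulling.
import numpy as np

def fit_line_residual(points):
    """RMS distance of the handle positions from their best-fit line."""
    centered = points - points.mean(axis=0)
    # The smallest singular value measures scatter off the principal direction.
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    return s[-1] / np.sqrt(len(points))

def fit_circle(points):
    """Algebraic (Kasa) circle fit; returns center, radius, RMS residual."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2
    (cx2, cy2, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = np.array([cx2 / 2.0, cy2 / 2.0])
    radius = np.sqrt(c + center @ center)
    residual = np.sqrt(np.mean((np.linalg.norm(points - center, axis=1) - radius) ** 2))
    return center, radius, residual

def infer_mechanism(handle_path):
    """Classify the mechanism from observed handle positions (N x 2 array)."""
    points = np.asarray(handle_path, dtype=float)
    line_err = fit_line_residual(points)
    _, radius, circle_err = fit_circle(points)
    # The 2.0 m cutoff is a made-up heuristic: door handles sweep arcs with a
    # radius on the order of the door's width.
    if circle_err < line_err and radius < 2.0:
        return "revolute (door)", radius
    return "prismatic (drawer)", np.inf
```

Once the mechanism is classified, a controller of this kind could steer the base and arm along the inferred line or arc instead of fighting it.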

In other words, most researchers trying to make robots open doors, cabinets, and similar things rely on a simple approach: keep the robot's base in place and move its arms to perform the task. It's easier to do -- and in fact that's how most robot manipulation systems work -- but it limits the kinds of tasks a robot can accomplish.

The Georgia Tech researchers allow their robot to move its omni-directional base while simultaneously pulling things open -- an approach they say improves the performance of the task.

There's no better way to understand it than seeing the robot in action:

So how did they do it?

First, a look at their robot. According to Travis Deyle, a researcher at the Healthcare Robotics Lab who first reported on the new robot and its capabilities at Hizook, the robot is called Cody [photo, right]. It consists of a Segway RMP 50 Omni base with Mecanum wheels, a vertical linear actuator that can raise the robot's torso up to 1.2 meters above the ground, a laser range finder, and a pair of 7-DOF MEKA Robotics arms.

A Mac Mini running Linux performs all the computation for sensing and high-level control. Another computer running a Linux-based real-time system controls the MEKA arms. The researchers wrote all their software in Python and used open-source packages like ROBOOP and ROS.

The robot uses a simple hook as its end effector, which the researchers built with a 3D printer and coated with rubber to increase friction. The hook's design is based on the way a person uses a finger to pull something open [photo below]. A 6-axis force sensor at the wrist measures the forces on the hook.
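
As a rough illustration of how those wrist force readings could be used -- this is an assumption on my part, not the authors' code -- a controller might simply check the force magnitude to decide whether the hook has actually caught the handle. The function name and the 5-newton threshold below are hypothetical.

```python
# Hypothetical illustration of using the 6-axis wrist sensor to decide
# whether the hook is engaged with a handle (not the authors' code).
import numpy as np

HOOKED_FORCE_N = 5.0  # assumed threshold; real values depend on the hardware

def hook_engaged(wrench, threshold=HOOKED_FORCE_N):
    """Return True if the pulling force at the wrist suggests the hook
    has caught the handle.

    wrench: 6-element array [fx, fy, fz, tx, ty, tz] from the force-torque sensor.
    """
    force = np.asarray(wrench[:3], dtype=float)
    return np.linalg.norm(force) > threshold
```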

But the most innovative thing is the control method they implemented, which they call equilibrium point control, or EPC. Here's the gist. Rather than model the dynamics of the arm and the impedance at the end effector or use inverse dynamics, the researchers created a control system that relies on simulated visco-elastic springs at the robot's joints. The EPC system uses these virtual springs, whose stiffness can be adjusted, to determine how the joints should move to achieve a desired movement.
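
Here's a simplified sketch of that idea in Python, under the assumption of an independent spring-damper element at each joint; the function names and numbers are illustrative, not the authors' implementation. Each joint torque is just the virtual stiffness times the distance from a commanded equilibrium angle, minus a damping term, and moving the equilibrium angles is what "pulls" the arm along.

```python
# A simplified sketch of equilibrium point control: each joint behaves like a
# virtual visco-elastic spring pulling it toward a commanded equilibrium angle.
# Illustrative only -- not the authors' implementation.
import numpy as np

def epc_joint_torques(q, q_dot, q_eq, stiffness, damping):
    """Torques generated by simulated springs at the joints.

    q         : current joint angles (rad)
    q_dot     : current joint velocities (rad/s)
    q_eq      : commanded equilibrium angles; moving these drives the arm
    stiffness : virtual spring stiffness per joint (N*m/rad), adjustable
    damping   : virtual damping per joint (N*m*s/rad)
    """
    q, q_dot, q_eq = map(np.asarray, (q, q_dot, q_eq))
    return stiffness * (q_eq - q) - damping * q_dot

# Example: a 2-joint arm being drawn toward a new equilibrium point.
q      = np.array([0.10, 0.50])   # where the arm currently is
q_eq   = np.array([0.30, 0.40])   # where the virtual springs pull it
torque = epc_joint_torques(q, np.zeros(2), q_eq,
                           stiffness=np.array([20.0, 15.0]),
                           damping=np.array([1.0, 1.0]))
```

In a controller like this, lowering the virtual stiffness makes the arm yield more readily when it meets unexpected resistance, rather than fighting it.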

Kemp and Jain say that this approach, combined with the robot's low mechanical impedance (which reduces the forces resulting from contact and thus minimizes the risks of damage to the robot, objects, and people), proved "easy to work with, easy to implement, and surprisingly effective."

They tested their approach with 10 different doors and drawers, reporting that the robot succeeded in 37 out of 40 trials. What's more, the robot was able to open doors and drawers from initial positions that would make the task difficult for a stationary robot.

They write: "We empirically demonstrate that the system is robust to common forms of task variation, including variation in the mechanism being operated (tested with 7 doors and 3 drawers), and variation in the pose of the robot's base with respect to the handle."

I think that's researchspeak for "It works!"

Images and video: Georgia Tech's Healthcare Robotics Lab
