Robots With Warm Skin Know What They're Touching

Giving robots warm skin can help them identify what objects are made of


Usually, if your robot is warm to the touch, it’s symptomatic of some sort of horrific failure of its cooling system. Robots aren’t supposed to be warm—they’re supposed to be steely and cold. Or at least, steely and ambient temperature. Heat is almost always a byproduct that needs to be somehow accounted for and dealt with. Humans and many other non-reptiles expend a lot of energy keeping at a near-constant temperature, and as it turns out, being warmish all the time provides a lot of fringe benefits, including the ability to gather useful information about things that we touch. Now robots can have this ability, too.

Most of the touch sensors used by robots are force detectors. They can tell how hard a surface is, and sometimes what kind of texture it has. You can also add some temperature sensors into the mix to tell you whether the surface is warm or cold. However, most of the time, objects around you aren’t warm or cold; they’re ambient—whatever the temperature is around them is the temperature they are.


When we humans touch ambient temperature things, we often experience them as feeling slightly warmer or colder than they really are. There are two reasons for this: The first reason is that we’re toasty warm, so we’re feeling the difference in temperature between our skin and the thing. The second reason is that we’re also feeling how much the thing is sucking up our toasty warmness. In other words, we’re measuring how quickly the thing is absorbing our body heat, and in even more other words, we’re measuring its thermal conductivity. Try it: Something metal will feel cooler to you than something fabric or wood, even if they’re both the same temperature, because the metal is more thermally conductive and is sucking the heat out of you faster. The upshot of this is that we have the ability to gather additional data about materials that we touch because our fingers are warm. 
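This effect can be sketched with a textbook heat-transfer result: when two bodies touch, the interface temperature is the mean of their temperatures weighted by thermal effusivity (the square root of conductivity times density times heat capacity). The sketch below uses rough, textbook-style material properties—not measurements from the Georgia Tech paper—to show why aluminum at room temperature feels cold while wood at the same temperature barely does.

```python
import math

def effusivity(k, rho, c):
    """Thermal effusivity, sqrt(k * rho * c), in J/(m^2 K s^0.5)."""
    return math.sqrt(k * rho * c)

def contact_temp(t_skin, e_skin, t_obj, e_obj):
    """Interface temperature when two semi-infinite bodies touch:
    the effusivity-weighted mean of their temperatures."""
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

# Approximate material properties: conductivity (W/m K), density (kg/m^3),
# specific heat (J/kg K). Illustrative values only.
e_skin = effusivity(0.37, 1100, 3500)   # human skin
e_alum = effusivity(237, 2700, 900)     # aluminum
e_wood = effusivity(0.15, 600, 1700)    # softwood

# Skin at 33 C touching objects that are both at 20 C:
print(contact_temp(33, e_skin, 20, e_alum))  # near 20 C: aluminum feels cold
print(contact_temp(33, e_skin, 20, e_wood))  # near 30 C: wood feels almost skin-warm
```

The aluminum drags the interface almost all the way down to its own temperature; the wood barely moves it. That difference—not the objects' actual temperatures, which are identical—is what a warm finger (or a heated taxel) can measure.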

Joshua Wade, Tapomayukh Bhattacharjee, and Professor Charlie Kemp from Georgia Tech presented a paper at an IROS workshop last month introducing a new kind of robotic skin that incorporates active heating. When combined with traditional force sensing, the active heating results in a multimodal touch sensor that helps to identify the composition of objects.

Georgia Tech's multimodal fabric-based tactile sensing skin prototype. Image: Georgia Tech

Okay, so it’s not much to look at, but the combination of force and active thermal sensing works significantly better than force sensing alone. The fabric is made of an array of “taxels,” each of which consists of resistive fabric sandwiched between two layers of conductive fabric, two passive thermistors, and two active thermistors placed on top of a carbon fiber resistive heating strip. Using all three of these sensing modalities to validate each other, the researchers were able to identify wood and aluminum by touch up to 96 percent of the time while pressing on them, or 84 percent of the time with a sliding touch.
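To give a feel for how multimodal readings beat force alone, here is a deliberately simple sketch—not the authors' actual pipeline—of classifying a touched material from two hypothetical taxel features: how fast the actively heated skin loses heat into the object, and how stiff the object feels. The training values and feature scales below are invented for illustration.

```python
import math

# Hypothetical labeled touches: (heat_loss_rate in C/s, stiffness in N/mm, label)
TRAIN = [
    (0.90, 60.0, "aluminum"),  # metal: conducts heat away fast, stiff
    (0.80, 55.0, "aluminum"),
    (0.20, 20.0, "wood"),      # wood: poor conductor, more compliant
    (0.25, 25.0, "wood"),
]

def classify(heat_loss_rate, stiffness):
    """1-nearest-neighbor over the two features, crudely normalized
    by typical feature scales so neither dominates the distance."""
    def dist(sample):
        return math.hypot((heat_loss_rate - sample[0]) / 1.0,
                          (stiffness - sample[1]) / 60.0)
    return min(TRAIN, key=dist)[2]

print(classify(0.85, 58.0))  # a fast-cooling, stiff touch -> "aluminum"
print(classify(0.22, 22.0))  # a slow-cooling, softer touch -> "wood"
```

The point of combining modalities is that materials that are ambiguous along one axis (two objects of similar stiffness, say) can still separate cleanly along the other.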

We should mention that this isn’t the first active thermal sensor—the BioTac sensor from SynTouch also incorporates a heater, although it’s only a fingertip, as opposed to the whole-arm fabric-based tactile skin that Georgia Tech is working on. 

Tapo Bhattacharjee told us that there are plenty of different potential applications for a sensor like this. “A robot could use this skin for manipulation in cluttered or human environments. Knowing the haptic properties of the objects that a robot touches could help in devising intelligent manipulation strategies, [for example] a robot could push a soft object more than say a hard object. Or, if the robot knows it is touching a human, it can be more conservative in terms of applied forces.”

“Force and Thermal Sensing With a Fabric-Based Skin,” by Joshua Wade, Tapomayukh Bhattacharjee, and Charles C. Kemp from Georgia Tech, was presented at the Workshop on Multimodal Sensor-Based Robot Control for HRI and Soft Manipulation at IROS 2016 in Daejeon, South Korea.

[ Georgia Tech Healthcare Robotics Lab ]
