Soldiers Can Get Emotionally Attached to Robots, and That May Not Be a Good Thing

When a robot is your friend, is it harder to send it into harm's way?


Above: The remains of Packbot #129, killed in the line of duty. Now in iRobot's corporate museum.

Humans developing emotional attachments to robots is well known, and well documented. Heck, we love every single robot we've ever met. Things get more complicated, however, when robot interaction is a job and not a hobby, and especially when the robots that you're working with are (to some extent) designed specifically to get blown to smithereens.

Julie Carpenter, from the University of Washington, recently published a thesis entitled "The Quiet Professional: An Investigation of U.S. Military Explosive Ordnance Disposal Personnel Interactions With Everyday Field Robots." She investigated whether an emotional attachment to an EOD (explosive ordnance disposal) robot might influence the decisions made by the robot's operator to the extent that it could alter the outcome of a mission.

What Carpenter found is that troops’ relationships with robots continue to evolve as the technology changes. Soldiers told her that attachment to their robots didn’t affect their performance, yet they acknowledged feeling a range of emotions, including frustration, anger, and even sadness, when a field robot was destroyed. That makes Carpenter wonder whether outcomes on the battlefield could be compromised by human-robot attachment, or by the feeling of self-extension into the robot that some operators described.

“They were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet,” Carpenter said.

Over on Reddit, a commenter called "mastersterling" related an experience that he had with EOD robots in Iraq. Possibly NSFW for mildly colorful expression:

While clearing an IED on a bridge, my Talon decided to make left face and head directly for the only gap in the guard rail on the entire bridge. My team member started screaming that he had lost comms with the bot and was frantically powering down the system in an attempt to stop the bot from plunging into the Tigris. The bot stopped with about 25-35% of its length hanging over the river, and the only thing keeping it from going over was the weight of the water bottle charge it was holding in its gripper. We sent our little Packbot (already named Danny DeVito) down to drag the Talon back to safety, which it could not do. So we had the Packbot hook a rope to the handle and pulled it back to the middle of the road manually while Danny went down and manually ripped the [IED] apart. This was around the time Owen Wilson tried to kill himself, so we ended up calling our Talon Owen Wilson.

Some of the grunts I worked with lost a MARCbot and they awarded him a Purple Heart, BSM, and they did a full burial detail with 21 gun salute at Taji. Some people got upset about it but those little bastards can develop a personality, and they save so many lives.

One potential issue in a situation like this is that an EOD team might form a stronger emotional attachment to one of their robots than to another, and then be reluctant to use the most appropriate robot in a particularly risky situation. We're not saying that this happens right now (and according to Carpenter, soldiers say that it definitely doesn't), but it's possible that as robots become more interactive and sophisticated, it could become more of an issue.

So what now? Carpenter suggests that designers of the next generation of EOD robots take factors like this into consideration. In other words, the robots should be designed to have less personality, and to be more like tools, so that they're harder to form relationships with. I'm a little skeptical, however, that this will work, simply because humans seem capable of imbuing just about any inanimate object with personality, even objects that don't drive around, occasionally do weird things, and sometimes get blown up in your place.

[ UW ] via [ PBS ]

Thanks Will!
