Why You Want Your Drone to Have Emotions

Image: Stanford University
Example of three different flight paths to reflect different emotional states of the drone. Each personality profile is represented by a color: Adventurer Hero (Red), Anti-Social (Blue), and Exhausted (White).

There’s been a lot of research on how humans interact with robots. In fact, there’s a whole field for it, called human-robot interaction (HRI), with its own flagship conference (co-sponsored by IEEE) going on right now in New Zealand. Most of the research in this field focuses on how humans interact with social robots, including home robots, commercial robots, and educational robots and toys, but odds are, if you personally own a robot, it’s either a vacuum or a drone.

As drones have become more and more pervasive over the last few years, HRI research on them has been expanding. The latest contribution to this area is a fascinating paper being presented at the HRI conference on “Emotion Encoding in Human-Drone Interaction.” In other words, how you can program a recognizable personality into a drone.

Why would anyone want a drone with the ability to express emotions? Emotional expression is, essentially, a way of communicating information. You could communicate something like “I am tired” to other people by telling them, or you could do it by acting tired: moving slowly, yawning a lot, and closing your eyes. Depending on the situation, expressing your tiredness through actions might be more effective than just saying it, like if you don’t want to be noisy about it, or if you need to communicate with someone who doesn’t speak your language.

Robots can communicate in much the same way. A robot could express the fact that it’s low on battery by showing its battery life on a display, but that only works if the user is looking at, and understands, the display. If, on the other hand, the robot starts moving sluggishly and making yawning noises, even an untrained user can easily identify a state of tiredness.

For drones, there are all kinds of ways in which emotional expressions like these could be useful. Tiredness (sluggish movement or latency in responding to commands) to indicate low battery is a straightforward one. There’s also fear: a drone could look “scared” when it’s been commanded to fly outside of the range of its controller. Or if it receives a command that it doesn’t understand, it could look “confused.” Again, the drone could communicate these states more directly by using a display on a controller, but especially for a flying robot that you want to keep your eyes on, exhibiting emotions in this way could be an effective form of communication.

Researchers from Stanford University, led by Dr. Jessica Cauchard, have established an “emotional model space” for drones: a set of eight emotional states (personalities), each with defining characteristics that human users can easily recognize and that the drone can accurately convey through simple actions. The eight personalities are: brave, dopey, sleepy, grumpy, happy, sad, scared, and shy. For example, a drone with a brave personality moves quickly and smoothly, and if you ask it to go backwards, it’ll instead turn around and go forwards. A dopey drone flies with a bit of a wobble. A grumpy drone may require you to repeat commands, while a sad drone flies low to the ground.

The researchers took these eight personality types for drones, and distilled them down to just four: the Exhausted Drone, the Anti-Social Drone, the Adventurer Hero Drone, and the Sneaky Spy Drone. For the initial testing in this paper, they decided not to use the Sneaky Spy Drone, leaving them with three personality profiles. This table shows how those personality profiles manifested themselves in the drones’ behavior:

[Table: how the three personality profiles manifest in the drones’ flight behavior]
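The paper’s table gives the actual behavior parameters; as a rough illustration of the general idea of encoding personality into flight control, here is a minimal Python sketch. All names and numbers below are invented for illustration, not taken from the paper: each personality is reduced to a speed multiplier, an altitude bias, a response delay, and a probability of obeying a command on the first ask.

```python
import random
from dataclasses import dataclass

@dataclass
class PersonalityProfile:
    """Hypothetical control parameters encoding a drone 'emotion'."""
    speed_scale: float       # multiplier on commanded velocity
    altitude_bias: float     # meters added to commanded altitude
    response_delay: float    # seconds to wait before acting on a command
    obey_probability: float  # chance a command is executed on the first ask

# Illustrative values only -- not the paper's actual parameters.
PROFILES = {
    "adventurer_hero": PersonalityProfile(1.5, +0.5, 0.0, 1.0),
    "anti_social":     PersonalityProfile(0.8, -0.5, 0.5, 0.5),
    "exhausted":       PersonalityProfile(0.4, -1.0, 1.5, 0.7),
}

def apply_personality(profile, velocity, altitude):
    """Filter a raw command (velocity, altitude) through a personality.

    Returns the adjusted (velocity, altitude, delay) tuple, or None if
    the drone 'ignores' the command and the user must repeat it.
    """
    if random.random() > profile.obey_probability:
        return None  # grumpy/anti-social behavior: command is ignored
    return (velocity * profile.speed_scale,
            altitude + profile.altitude_bias,
            profile.response_delay)
```

For example, the Adventurer Hero profile (which always obeys) turns a 2 m/s command into a faster, slightly higher, zero-latency maneuver, while the Exhausted profile slows everything down and sometimes fails to respond at all.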

And here’s a little bit of video showing how the study worked in practice:

The study used only very basic control parameters to differentiate the drones’ personalities for users. Each personality also has a full interaction profile that could be used to develop richer emotionally expressive behaviors. Compare the interaction profiles for the Adventurer Hero Drone (a combination of happy and brave) with the Exhausted Drone (a combination of dopey, sleepy, and sad):

[Table: interaction profiles for the Adventurer Hero Drone and the Exhausted Drone]

The study itself asked participants to interact with a drone and then answer questions about the experience. Users were generally able to tell which personality the drone was exhibiting (the average recognition rate was 60 percent), and the Adventurer Hero personality was recognized 100 percent of the time. All participants were able to identify changes in the drones’ behavior, suggesting that people will have no trouble noticing drone movements designed to convey specific emotions. It’s also fascinating to read what emotions users themselves ascribed to the drones. Here are some samples:

Exhausted Drone

“Sharp movements but not always very coordinated, seems incompliant but bold.”

“Slow to respond, disobedient, could be because it’s mad or stupid.”

Anti-Social Drone

“The drone didn’t seem to ‘get it’. Just kind of moped around.”

“At first I was thinking sad/low energy because it took multiple commands every time and it kept flying so low and dropping. But at the end, when it was disobeying by not stopping, I figured it was capable but reluctant.”

Adventurer Hero Drone

“It flew a lot with its nose down so it seemed to me to signal bravery.”

“Extra movements made it look like the drone couldn’t contain its excitement.”

Many of the study participants compared the Adventurer Hero Drone to a pet dog, which seems like a positive step towards helping drones make more of an emotional connection with their users. As the researchers point out, drones with personalities “would become more interesting objects to interact with and the stereotype of personality model could bring more realism to the interaction, facilitating their acceptability in personal spaces.” Imagine a drone that you could go jogging with that would show excitement to leave the house with you, show happiness if you were to run a little farther or a little faster, and then share in your exhaustion when you made it back home. The researchers are actively working on this kind of thing, and we’re excited to see what they come up with.

“Emotion Encoding in Human-Drone Interaction,” by Jessica R. Cauchard, Kevin Y. Zhai, Marco Spadafora, and James A. Landay was presented this week at HRI 2016 in Christchurch, New Zealand.
