Do Kids Care If Their Robot Friend Gets Stuffed Into a Closet?

Researchers force Robovie into a closet as a way of testing whether children see robots as moral beings

"Please don't put me in the closet," cries the robot.

Last week, we wrote about a study that looked at whether humans attribute moral accountability and emotions to robots. This week, we've got a study from the same group, the Human Interaction With Nature and Technological Systems (HINTS) Lab at the University of Washington, that takes a look at what kinds of relationships children are likely to form with social robots, and it involves forcing their new robot friend into a dark, lonely closet.

The 90 children in this study were separated into three age groups: 9-year-olds, 12-year-olds, and 15-year-olds, with an equal mix of boys and girls. As with the previous study, the robot involved in the research was Robovie, a vaguely humanoid robot that was secretly teleoperated to give it the appearance¹ of a sophisticated (but not necessarily unrealistic) level of autonomy and interactivity.

The core of the study was a 15-minute, very carefully structured "interaction session" between Robovie, a child, and several adult researchers. First, each child is introduced to Robovie, and the robot leads them to a fish tank, where it makes some small talk about fish and coral reefs. This is specifically intended to create a preliminary sort of bond between Robovie and the child through the sharing of interests and personal history, and several other subtle interactions like this promote such a bond throughout the interaction session.

The session continues with a game of "I Spy," a guessing game in which Robovie gives the child verbal clues to help them locate objects around the room. After the game is finished, Robovie asks for a hug, which is another one of those bonding moments, and then the game is played again, this time with the child giving clues and Robovie guessing the objects. Here's where things get interesting: in the middle of this second round, an adult experimenter enters the lab and cuts the game short, with some consequences for Robovie. Watch:

Geez. I mean, I know Robovie isn't sentient. Furthermore, I know it's being teleoperated the whole time and that there's a human behind a curtain somewhere directing it to say what it says. But I still cannot help feeling bad for the robot, and this (I'm guessing) is what the researchers are trying to get at.

After Robovie gets stuffed into the closet, the child (or, in this case, teenager) is subjected to a 50-minute structured interview involving a series of questions designed to figure out "whether and, if so, how children conceived of Robovie as a mental, social, and moral other." In other words, should the robot be treated like a person, or is it okay to treat it like a tool?

Overall, 80 percent of the participants felt that Robovie was intelligent, and 60 percent thought that Robovie had feelings. At the same time, over 80 percent believed that it was just fine for people to own and sell Robovie. Hmm. Just over half of the children (54 percent) felt that it was not all right to put Robovie in the closet, although close to 90 percent agreed with Robovie that it wasn't fair to put it in the closet and that it should have been allowed to at least finish the game it was playing.

Things get even more interesting when you break down the results by age. For example, while 93 percent and 67 percent of 9-year-olds said that they believed Robovie to be intelligent and to have feelings, respectively, those percentages drop to 70 percent and just 43 percent when you ask 15-year-olds the same thing. Older children were also much less likely to think of Robovie as a friend, but more likely to object to a person being able to sell Robovie.

While it seems clear that children won't have much trouble developing substantive relationships with humanoid robots, at least if (or when) robots become as social as a teleoperated Robovie, it's less clear to what extent these robots will be treated as tools, and to what extent they'll be treated as living things, if not like humans then at least like animals:

"On the one hand, the majority of children did not grant Robovie civil liberties (Robovie could be bought and sold) or civil rights (Robovie should not have voting rights or receive fair compensation for work performed). On the other hand, more than half of the children (54%) said that it was not all right to have put Robovie in the closet."

"What then are these robots? One answer, though highly speculative, is that we are creating a new ontological being with its own unique properties. Recall, for example, that we had asked children whether they thought Robovie was a living being. Results showed that 38% of the children were unwilling to commit to either category and talked in various ways of Robovie being “in between” living and not living or simply not fitting either category. As one child said, “He’s like, he’s half living, half not.” It is as if we showed you an orange object and asked you, “Is this object red or yellow?” You might say that it is neither and both. You might say that while you understand the question and that aspects of the question certainly make sense, when we combined red and yellow together we created something uniquely its own. That may be our trajectory with robots, as we create embodied entities that are “technologically alive”: autonomous, self-organizing, capable of modifying their behavior in response to contingent stimuli, capable of learning new behaviors, communicative in physical gesture and language, and increasingly social."

While I'm fairly certain that the phrase "new ontological being" is something that researchers use to make themselves seem smarter than the rest of us, the gist seems to be that people often think of robots as more than machines. Not necessarily as humans, mind you, but not as something like a broom or a lawnmower either. It's easy for roboticists to look at their creations and see machines executing code, but to some extent all humans have a penchant for anthropomorphization, in terms of both appearance and (it seems) morality. That's something we're going to have to figure out how to manage as robots become more socially integrated into our lives.

There's a third large study in the works by this same group that's investigating "the depths and limits of people's psychological intimacy with social robots" by "examining whether young adults will keep a robot's secret from an experimenter." My guess? As long as the experimenter is an adult and some sort of authority figure, definitely yes.

The study was funded by the National Science Foundation, and the researchers—Peter H. Kahn, Jr., Takayuki Kanda, Hiroshi Ishiguro, Nathan G. Freier, Rachel L. Severson, Brian T. Gill, Jolina H. Ruckert, and Solace Shen—reported their findings in “‘Robovie, You’ll Have to Go Into the Closet Now’: Children’s Social and Moral Relationships With a Humanoid Robot,” published in Developmental Psychology, March 2012, Vol. 48, No. 2, pp. 303–314.

[ HINTS Lab ]

1"Right before we debriefed each participant, we assessed whether the participant believed that Robovie was acting autonomously. We asked the question, 'One child I spoke with said that they thought Robovie was controlled by a person sitting at a computer nearby. Do you think that this child was right or not right?' Results showed that 81% of the participants explicitly disagreed with this other child. There were no age or gender differences."
