Automaton

How Recycling Robots Could Help Us Clean the Planet


Dustbot, a garbage-collecting robot created by the Scuola Superiore Sant'Anna's CRIM Lab.
Photo: Massimo Brega

At the current rate of global population growth and resource consumption, it appears clear to me where we're going to end up: on a waste-covered Earth like the one depicted in the movie WALL-E.

Needless to say, recycling is one of the most important things we can do to keep our planet sustainable. I think it won't be long until governments all over the world create all kinds of incentives to improve recycling.

Which brings us to ... robots!

Recycling is a very promising area for robotics. Over the next few decades, I imagine a future where waste-collecting robots move through air, land, and water, reaching difficult areas to help us clean our environment. Picture WALL-E, but before the whole planet becomes a landfill.

In fact, there are already some recycling bot prototypes roaming around. One example is Dustbot, a robot developed at the Scuola Superiore Sant'Anna's CRIM Lab, in Pisa, Italy. Led by Prof. Paolo Dario, the laboratory created a robot designed specifically to collect garbage at people's homes.

It's 1.5 meters tall, weighs 70 kilograms, and can carry 80 liters, or 30 kg, of payload. The robot can travel at 1 meter per second, and its battery gives it 16 kilometers of autonomy.


Photo: Massimo Brega

According to this BBC story, the Dustbot can be summoned to your address through a mobile phone at any time of day. Basically, the machine -- built on a Segway Robotic Mobility Platform -- uses GPS and motion sensors to drive around the city and show up at your doorstep.

Once it arrives, the user selects the type of garbage to dispose of using a touch screen. A compartment opens in the robot's belly where the user places the garbage, which is then transported to a drop-off location.
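The summon-and-collect flow described above could be sketched roughly as follows. Everything here -- class names, method names, the depot -- is a hypothetical illustration of the workflow, not Dustbot's actual software, which isn't public:

```python
# Hypothetical sketch of a Dustbot-style summon-and-collect flow.
# All names are illustrative; the real control software is not public.
from dataclasses import dataclass, field

@dataclass
class CollectionRequest:
    address: str
    garbage_type: str  # e.g. "organic", "recyclable", "residual"

@dataclass
class Dustbot:
    dropoff: str
    load: list = field(default_factory=list)

    def summon(self, request: CollectionRequest) -> str:
        # In the real robot, GPS and motion sensors guide it to the address.
        return f"navigating to {request.address}"

    def collect(self, request: CollectionRequest) -> str:
        # The user picks a garbage type on the touch screen; a compartment
        # in the robot's belly opens to receive the bag.
        self.load.append(request.garbage_type)
        return f"compartment open for {request.garbage_type}"

    def deliver(self) -> str:
        # Collected garbage is carried to the drop-off location.
        items, self.load = self.load, []
        return f"dropped {len(items)} item(s) at {self.dropoff}"

bot = Dustbot(dropoff="central depot")
req = CollectionRequest("Via Roma 1, Pisa", "recyclable")
print(bot.summon(req))   # navigating to Via Roma 1, Pisa
print(bot.collect(req))  # compartment open for recyclable
print(bot.deliver())     # dropped 1 item(s) at central depot
```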

The robot's greatest advantage is its size: it can navigate through narrow streets and alleys where normal garbage trucks can't go.

Here's a video showing how Dustbot -- and its "siblings" DustCart and DustClean robots -- work:


Another example is Push, a robot that patrols the streets of Disney World, asking people to feed it rubbish. Well, it's not exactly a robot -- it's a remote-controlled garbage can. An operator drives it through the crowd, using a speaker system to talk to people and persuade them to recycle their garbage.

Watch it in action in the video below.


It's not WALL-E, but it's funny and efficient, and if it could be made truly autonomous, this simple robot -- along with an army of Dustbots and similar machines -- would be a powerful way of keeping the streets, and hopefully the planet, a bit cleaner.

Do you know of other recycling robots? Let us know.

UPDATED 04/22/10: Dustbot specs added.

Geminoid F: More Video and Photos of the Female Android

geminoid f
Photos: Osaka University (left); Osaka University and Kokoro Company (right); composite (middle).

Geminoid F, the female android recently unveiled by Hiroshi Ishiguro, a roboticist at Osaka University and ATR famous for his ultra-realistic humanlike androids, generated a lot of interest. Several people wrote me asking for more details and also more images. So here's some good news. I got some exclusive photos and video of Geminoid F, courtesy of Osaka University, ATR Intelligent Robotics and Communication Laboratories, and Kokoro Company. Below is a video I put together giving an overview of the project.


Watch in HD here.

And here are some more photos of the android. The first one below is a composite I created using the two photos right beneath it. It shows how the android's silicone body hides all the mechanical and electronic parts.

geminoid f
Composite based on the photos below. Notice that the robot's body is not in exactly the same position in the two images, so the composite is not a perfect match; I also had to flip the skeleton image to get the right angle, creating a mirrored image that obviously doesn't correspond to reality.

geminoid f
Photos: Osaka University and Kokoro Company; Osaka University

Here's a Kokoro engineer working on the android's face. Ishiguro and Kokoro have long been collaborators, creating several humanlike androids that include the Geminoid HI-1 and Repliee Q1 and Q2.

geminoid f
Photo: Osaka University and Kokoro Company

In developing Geminoid F, Ishiguro paid particular attention to the facial expressions. He wanted an android that could exhibit a natural smile -- and also a frown.

geminoid f
Photos: Osaka University

The android is a copy of a woman in her twenties. Ishiguro told me that her identity will remain "confidential."


Photo: Osaka University

geminoid f
Photo: Osaka University

Here's Geminoid F meeting Geminoid HI-1.

geminoid f
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

geminoid f and geminoid hi-1
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

This one below shows the woman teleoperating the android. A vision system captures her mouth and head movements, reproducing those movements on the android. The woman can also use the mouse to activate certain behaviors.
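The teleoperation loop described above -- a camera tracks the operator's head and mouth, and the android mirrors those movements -- might look something like the sketch below. The mapping, names, and ranges are my own guesses for illustration; the actual Geminoid F software is not public:

```python
# Rough sketch of a face-tracking teleoperation mapping: tracked operator
# pose becomes normalized actuator commands. All names, axes, and ranges
# are hypothetical illustrations, not the actual Geminoid F software.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def map_pose_to_actuators(head_yaw_deg, head_pitch_deg, mouth_open):
    """Map tracked operator pose to normalized actuator commands.

    head_yaw_deg / head_pitch_deg: degrees, roughly -30..30
    mouth_open: 0.0 (closed) to 1.0 (fully open)
    """
    return {
        "neck_yaw": clamp(head_yaw_deg / 30.0, -1.0, 1.0),
        "neck_pitch": clamp(head_pitch_deg / 30.0, -1.0, 1.0),
        "jaw": clamp(mouth_open, 0.0, 1.0),
    }

# One tick of the loop: the operator looks slightly left, mouth half open.
cmd = map_pose_to_actuators(head_yaw_deg=-15, head_pitch_deg=5, mouth_open=0.5)
print(cmd["neck_yaw"], cmd["jaw"])  # -0.5 0.5
```

Each camera frame would produce one such command dictionary, streamed to the android's servo controllers; the mouse-activated behaviors mentioned above would simply inject prerecorded command sequences into the same channel.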

geminoid f
Photo: Osaka University

So tell us: Was Ishiguro able to leap over the abyss of the uncanny valley?

Bandit, Little Dog, and More: University of Southern California Shows Off Its Robots


Bandit, a caregiving humanoid robot developed at USC's Interaction Lab

Last Thursday, I headed out to the University of Southern California campus in Los Angeles for an open house at the Center for Robotics and Embedded Systems (CRES). It was a great opportunity to see some amazing research on humanoid robots, robots learning from humans, machine learning, and biologically inspired robots. Some highlights:

Let's start at the Interaction Lab, led by Dr. Maja J. Mataric, a professor of computer science, neuroscience, and pediatrics and director of CRES. Her lab focuses on human-robot interaction, specifically with the goal of developing "socially assistive systems" to help in convalescence, rehabilitation, training, education, and emergency response. (Spectrum recently ran a profile of Mataric; read it here.)

Ross Mead, a graduate student in Mataric's group, is currently working with children with autism through USC's Center for Autism Research in Engineering (CARE). Children with autism tend to interact more easily with robots than with humans. So Dr. Mataric’s group has been exploring the use of socially assistive robots in conjunction with speech processing technology to help improve social communication skills of the children.

Image courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Current results have shown improved speech and interaction skills in autistic children when they are presented with robots such as Bandit, the lab's caregiving robot. It has 6-DOF arms, a head that can pan and tilt, a face with a movable mouth and eyebrows, and stereo cameras for eyes.

In another application, Bandit serves as a social and cognitive aid for the elderly. It will not only instruct the user to perform certain movements, but also motivate the person and ensure that each movement is performed correctly.

Below is a video of Bandit showing off USC colors and interacting with graduate student Juan Fasola (and here's a video with an overview of the project).


Video courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Ross Mead is also studying what aspects of robot design create a more humanlike appearance and improve the acceptance of robots by humans. This work has involved Sparky (below), a “minimatronic figure” developed by Walt Disney Imagineering Research and Development. The robot has 18 degrees of freedom and uses small servos and tendon-driven mechanisms to reproduce humanlike motions.

One possible application for Sparky is as a lab tour guide. Equipped with a mobile base, it should be able to stop at various parts of the lab and describe the various projects using speech and gestures.

Watch the video below to see how Sparky uses its tendons and a spring as a spine to try to achieve natural movements:



Next up is the Computational Learning and Motor Control Lab headed by Dr. Stefan Schaal, a professor of computer science and neuroscience.

As part of the DARPA Learning Locomotion program, Schaal and his colleagues are investigating legged locomotion with the quadruped robot Little Dog, developed by Boston Dynamics, whose other robots include the quadruped Big Dog, the LS3 robotic mule, and the biped PETMAN.

Legged robots have the potential to navigate more diverse and complex terrain than wheeled robots, but current control algorithms hold them back. So Schaal’s group is using Little Dog as a platform for learning locomotion, developing learning algorithms that will enable robots to traverse large, irregular, and unexpected obstacles.

I had the opportunity to speak with Dr. Jonas Buchli and Peter Pastor of Dr. Schaal’s group following a demonstration of Little Dog. They discussed potential applications that include survivor location and recovery after a disaster, prosthetic limbs, and space exploration.

Watch the video below to see Little Dog in action (and watch this other video to see the little bot performing even more maneuvers).


Finally, at USC's iLab, Dr. Laurent Itti, a professor of computer science, is investigating how to make robots interact more naturally with humans and more effectively integrate into our lives. For that to happen, it will be important to create robots with humanlike qualities. In other words, robots will have to demonstrate humanlike locomotion, facial expressions, and eye movement. In addition, as robots gradually leave controlled environments, such as factory floors, and enter environments populated by humans, they’ll need enhanced cognitive abilities that enable them to autonomously navigate in an unstructured environment. One way of achieving that is by looking at biology.

One of the lines of research Itti and his students are pursuing involves monitoring the gaze of human participants as they watch a movie or play a video game. Such research provides a window into how the brain functions, as well as how it may become altered in diseased states. Furthermore, insights into brain function gleaned from the research have applications in machine vision, image processing, robotics, and artificial intelligence. Dr. Itti is also investigating the application of biologically inspired visual models for automatic target detection in cluttered environments, driver alertness monitoring, autonomous robotic navigation, and video games.

His group launched the Beobot 2.0 project to create an integrated and embodied artificial intelligence system and, through providing open access to their hardware and software design, enable other research groups to build other robots with diverse capabilities. Below is a picture of Beobot 2.0, and you can watch a video here to see it navigating a corridor.

beobot ilab usc
Image courtesy of Dr. Laurent Itti and USC's iLab

With the expected growth of the robot population over the coming decades, robots will become a prevalent force in our lives, permeating environments well beyond manufacturing: everything from healthcare and emergency response to personal entertainment and services. While providing many benefits, robots will also become part of society, raising new and unforeseen social and ethical questions that will, in effect, give us a better understanding of ourselves and what it means to be human.

In the meantime, what's my Roomba doing?

Daniel Garcia is an intern at Lux Capital and is interested in clean technology and innovations in healthcare. He holds a PhD in biomedical engineering from UCLA.

Dispatches from FIRST Robotics Championship

For those of us who couldn't make it to the FIRST finals in Atlanta this week, National Instruments is posting some cool videos straight from the competition floor. Todd, an NI staff engineer, is doing "man on the street" interviews with the teams, talking about their robots, strategies, and ... costumes. Check out his interview with the Team FTC 7 Stormtrooper guy above. NI will be posting more videos tomorrow and Saturday.

NI is also announcing today that registrations are open for the Moonbots: Google Lunar X Prize LEGO Mindstorms Challenge. From the release:

The MoonBots Challenge is an exciting contest that challenges teams of kids (age 13 and up) and adult mentors to learn about robotics, the Moon, and space exploration by designing and constructing a LEGO MINDSTORMS robot that performs simulated lunar missions similar to those required to win the $30 million Google Lunar X PRIZE. Six-member teams are engaged in hands-on learning experiences, helping to inspire today’s kids to become tomorrow’s innovators and creative problem solvers who explore futures in science and engineering. The MoonBots Challenge is free and open to teams across the globe.

For more information, go to http://www.moonbots.org and watch the video below, an interview with Steven Canvin from LEGO about the competition:

NASA's Robonaut-2 Will Go to Space This Year

NASA's Robonaut-2 (R2), a semi-humanoid robot co-developed with GM, will rocket to space on the shuttle Discovery later this year as part of NASA's final space shuttle mission. It will be the first human-like robot NASA has sent to space.

R2's dexterous hands give it the strength and flexibility to manipulate tools just like humans do, making it an ideal helper for humans in space.

Once it gets to space, R2 will be confined to the inside of the International Space Station (ISS) while astronauts test its ability to operate in zero-g. It may eventually get space-certified like its non-humanoid relative, Dextre, a two-armed dexterous manipulator developed by the Canadian Space Agency. Dextre currently assists in tasks outside the space station.

NASA engineers have less than 6 months to get R2 ready for flight, including vibration, vacuum, and radiation tests. Watch this video to see how they'll do it. 

R2 will launch on STS-133, scheduled for September, and will remain on the ISS.

Photo and video courtesy of NASA.

It's National Robotics Week! Do You Know Where Your Robot Is?

On March 9th, House Resolution 1055 -- introduced by Pennsylvania representative Mike Doyle -- passed in the House, designating the week of April 10-18 as National Robotics Week. The Congressional Robotics Caucus, with some lobbying from iRobot Corporation and other organizations, introduced the resolution to promote activities that help raise awareness of and interest in the nation's growing robotics industry.

What is the Congressional Robotics Caucus, you may ask? In between healthcare, budgeting, recessing, showing up on the Colbert Report, and all the other activities our congresscritters are busy with, the bipartisan committee works hard to understand the various fields of robotics and tries to support them through their work in Congress. Most of the reps on the committee come from states with a strong academic or business presence in robotics, like California, Massachusetts, and Pennsylvania, or places with agricultural or manufacturing industries that field a lot of robots. They receive regular briefings from experts in the field to understand what we're doing here in the US and how competitive we are on a global scale.

So here we are at National Robotics Week. Perhaps not coincidentally, this also happens to be the week of the FIRST robotics championship event in Atlanta, one of the largest gatherings of current and aspiring roboticists in the world. But if you don't happen to be there, plenty of other cities -- big and small -- around the country are celebrating with mini-competitions, classes, laboratory open houses, block parties, and more. The full calendar of events is here. You can alternatively roll your own.

There are plenty of officially recognized Days, Weeks, and Months designed to raise awareness that mean very little to most people, but I'm actually really excited about NRW. Robotics has a unique way of engaging people and getting them excited about technology that looks straight out of sci-fi, and using it as a vehicle to get kids into STEM fields and adults into learning about and supporting our industry is really effective. Check out the events in your area, encourage your friends to attend, and don't forget your shirt!

World Robot Population Reaches 8.6 Million

robot population

The world's robot population has reached 8.6 million. That's more than one automaton for every citizen of Austria!

UPDATE: Boing Boing came up with a much better comparison:

World robot population: 8.6 million

World population of high net worth individuals, as of 2009: 8.6 million

Population of New Jersey: 8.7 million

Number of Americans who participated in Pilates last year: 8.6 million

I arrived at the 8.6 million estimate based on data from the latest edition of World Robotics, a great numbers-filled report prepared annually by the International Federation of Robotics, or IFR. The report came out late last year -- I finally had time to take a look at it -- and refers to the robot market up to the end of 2008.

So how are robots doing compared to previous years?

First, some nomenclature. The study divides robots into two categories: industrial robots and service robots. The first category includes welding systems, assembly manipulators, silicon wafer handlers -- you know, the big, heavy, expensive, many-degrees-of-freedom kind of machine. The second category consists of two subcategories: professional service robots (things like bomb-disposal bots, surgical systems, and milking robots) and personal service robots (vacuum cleaners, lawn mowers, and all sorts of robot hobby kits and toys).

As you can see from the chart above, the number of industrial robots grew to 1.3 million in 2008 from about 1 million in 2007, and service robots grew to 7.3 million from 5.5 million. So for industrial and service robots combined it's a 32 percent increase from 2007 to 2008, and that's huge.
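The arithmetic behind those totals is easy to verify; here's a quick sketch using the report's rounded figures:

```python
# Quick check of the combined-population math, using rounded figures.
industrial_2007, service_2007 = 1.0e6, 5.5e6
industrial_2008, service_2008 = 1.3e6, 7.3e6

total_2007 = industrial_2007 + service_2007   # 6.5 million
total_2008 = industrial_2008 + service_2008   # 8.6 million
growth = (total_2008 - total_2007) / total_2007

print(f"{total_2008 / 1e6:.1f} million robots, {growth:.0%} growth")
# 8.6 million robots, 32% growth
```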

That said, you have to understand the numbers. The World Robotics report doesn't add up industrial and service robots; I do. The report keeps these two categories separate, I believe, because they are very different robots in terms of complexity and cost: an industrial robot can be a multimillion-dollar manipulator (like the Kuka KR 1000 titan, below), whereas a service robot can be a $50 toy.

Another reason to keep them separate: the total numbers for each category mean different things. The total for industrial robots is the "worldwide operational stock," or robots actually operational today. The total for service robots, on the other hand, consists of units sold up to the end of 2008, which includes robots no longer in operation -- like that first-generation Roomba you harvested for parts long ago.

So why do I add the numbers? Well, because I think it's kind of cool to have a number for the world's robot population.

kuka robotics kr 1000 titan

Now on to some highlights from the report. First, industrial robots.

  • According to the report, 2008 sales reached 113,000 units, which is about the same as the previous year. It's a weak result, and the culprit, as you might have guessed, is the global economic meltdown.

  • A breakdown by region. Of the 2008 robot sales, more than half, or about 60,300 units, went to Asian countries (including Australia and New Zealand). The world's largest market, Japan, continues to see a decline, with supply falling by 8 percent to about 33,100 units. But Korea and emerging markets like China and the Southeast Asian countries and India saw increases in sales, with Korea adding 11,600 robots, up 28 percent from 2007, China adding 7,900 units, an increase of 20 percent, and Taiwan's robot acquisitions surging by 40 percent.

  • In the Americas, robot sales totaled about 17,200 units, 12 percent less than in 2007. The auto industry, the main robot buyer, retreated, and robot sales plunged.

  • Robot sales in Europe stagnated at about 35,100 units, with Germany taking the lead, adding 15,200 robots, 4 percent more than in 2007. Italy, Europe's second largest market after Germany, added 4,800 units and France, 2,600 robots.

  • So the total of industrial robots in 2008? First, a number that I hadn't seen before. The report says that "total accumulated yearly sales, measured since the introduction of industrial robots in industry at the end of 1960s, amounted to more than 1,970,000 units at the end of 2008." That's basically the total of industrial robots sold in the world. Ever. Cool! So to get the total of industrial robots in operation you need to remove the ones that have been taken out of service. People use different statistical models to do that, arriving at different numbers. The World Robotics report gives an estimate between 1,036,000 and 1,300,000 units.

  • Also according to the report, world industrial robot sales amounted to about US $6.2 billion in 2008. But this amount doesn't include the cost of software, peripherals, and systems engineering. If you were to add all that up, the market would be some three times larger, around $19 billion.

irobot roomba

Now on to service robots.

  • First, some more nomenclature. The World Robotics report differentiates between two kinds of service robots: service robots for professional use and service robots for personal use. That's because the personal ones are sold for much less and are mass produced.

  • According to the report, 63,000 service robots for professional use were sold in 2008, a market valued at $11.2 billion.

  • A breakdown by application: 30 percent (20,000 units) for defense, security, and rescue applications; 23 percent for milking robots; 9 percent for cleaning robots; 8 percent each for medical and underwater robots; 7 percent for construction and demolition robots; 6 percent for robot platforms for general use; and 5 percent for logistic systems.

  • As for service robots for personal use: 4.4 million units sold for home applications (vacuuming and lawn mowing bots) and about 2.8 million for entertainment and leisure (toy robots, hobby systems, and educational bots).

  • And here's an eye-opening number: In 2008 alone, about 940,000 vacuum cleaning robots (like the iRobot Roomba 562 Pet Series above) were sold, almost 50 percent more than in 2007. That's nearly a million new living rooms getting cleaned by robots!

  • Finally, a forecast. The report estimates that 49,000 professional service robots and 11.6 million personal service robots will be sold between 2009 and 2012.

A note about this last bullet point. If we take this forecast and add it to the little over 1 million industrial robots (whose numbers grow very slowly), we get a grand total world robot population of nearly 13 million by 2011 or 2012. That would mean one robot for every person in Zambia. Or Illinois.
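For the curious, here's the back-of-the-envelope arithmetic behind that figure. Note that the sum combines the 2009-2012 sales forecast with the industrial stock, not with the existing service-robot base:

```python
# Back-of-the-envelope sum behind the "nearly 13 million" figure.
industrial_stock = 1.3e6        # upper IFR estimate; growth is very slow
professional_forecast = 49_000  # professional service robots, 2009-2012
personal_forecast = 11.6e6      # personal service robots, 2009-2012

total = industrial_stock + professional_forecast + personal_forecast
print(f"{total / 1e6:.1f} million")  # 12.9 million
```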

As usual, special thanks go to the IFR statistical department folks for putting this report together.

Riding Honda's U3-X Unicycle of the Future

honda u3-x

It only has one wheel, but Honda's futuristic personal mobility device, called the U3-X, is no pedal-pusher. The unicycle of the future moves as you move, wheeling you to your destination simply by sensing your body tilting this way or that, Segway style.

Honda says the machine is designed for indoor use, but last week, when the company demoed it for us in New York, it worked just fine in Times Square. Watching the Honda engineers riding it around on a Broadway sidewalk was like getting a glimpse of the future.

I also got a chance to try the U3-X -- not on Broadway, but at a hotel conference room nearby. Following the instructions of the cheerful Shin-ichiro Kobashi, the U3-X lead engineer, I hopped on the seat and took a few seconds to orient myself.

After some tentative leaning to test how far I could go without falling, I got a feel for the device and found it simple to navigate. You just lean slightly in the desired direction and off you go.

It's definitely a trip. The footrests are there just for balance, not to steer. Your hands stay free, and you're perched just a little below eye level, at a height where it still feels natural to talk to someone standing up. Putting your feet down helps you stop and turn, especially if you're rapidly approaching a wall. And the whirring of the machine is so satisfying!

Watch the video to see how easy it is to ride it:

The U3-X uses a balance control system that derives from Honda's research on human walking dynamics for its famed Asimo bipedal humanoid robot.

honda asimo

When the rider leans his or her body, a tilt-angle sensor sends data to the balance control system, which in turn moves the wheel, maintaining balance.
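In control terms, keeping a rider balanced on one wheel is a classic inverted-pendulum problem. Here's a minimal sketch of what such a tilt-feedback loop could look like -- the structure and gains are purely illustrative, not Honda's actual controller:

```python
# Minimal PD-style sketch of a tilt-feedback balance loop.
# Gains are purely illustrative; Honda's actual controller is not public.

KP = 8.0   # proportional gain on tilt angle (illustrative)
KD = 1.5   # derivative gain on tilt rate (illustrative)

def balance_command(tilt_rad, tilt_rate_rad_s):
    """Return a wheel acceleration command that drives the wheel back
    under the rider's center of mass, countering the measured lean."""
    return KP * tilt_rad + KD * tilt_rate_rad_s

# Rider leans forward 0.1 rad while still tipping at 0.2 rad/s:
# the controller commands the wheel forward to catch the fall.
print(round(balance_command(0.1, 0.2), 3))  # 1.1
```

In the real machine this loop would run many times per second, with the sensor feeding the controller and the controller feeding the motor drivers.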

But the amazing thing about the U3-X is not quite visible: its omnidirectional wheel.

The wheel consists of a ring of small rubber wheels overlapping a single large wheel (see illustration below). When the large wheel rotates, the U3-X moves forward or backward. When the small wheels rotate, the machine moves left or right. And when both the large and small wheels turn at the same time, the U3-X moves diagonally.

Honda showed us animations but didn't let us take photos of the wheel itself. It's a really ingenious system that uses only two motors to accomplish all of its movement.
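The division of labor between the two motors can be written as a simple velocity mapping. This is just a sketch of the idea, with made-up wheel radii, since Honda hasn't published the actual parameters:

```python
# Sketch of the omnidirectional wheel's velocity mapping: the large
# wheel drives fore-aft motion, the ring of small rollers drives
# lateral motion, and running both yields diagonal travel.
# Radii are made-up placeholders, not Honda's actual dimensions.

R_LARGE = 0.15   # large wheel radius, meters (hypothetical)
R_SMALL = 0.02   # small roller radius, meters (hypothetical)

def ground_velocity(omega_large, omega_small):
    """Map the two motor speeds (rad/s) to ground velocity:
    vx is lateral (left/right), vy is fore-aft, both in m/s."""
    vy = R_LARGE * omega_large   # large wheel -> forward/backward
    vx = R_SMALL * omega_small   # small rollers -> left/right
    return vx, vy

# Spin both motors at once and the machine moves diagonally.
vx, vy = ground_velocity(omega_large=5.0, omega_small=10.0)
print(f"vx={vx:.2f} m/s, vy={vy:.2f} m/s")  # vx=0.20 m/s, vy=0.75 m/s
```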

So how fast can it go? It has a top speed of 6 kilometers per hour, a little faster than the average adult's walking speed, and the lithium-ion battery will let you ride around for an hour.

The machine weighs less than 10 kilograms (22 pounds) and max rider weight is currently 100 kg (220 lb).

honda u3-x

Because it's such a narrow device, no wider than the distance between your legs, it won't get in the way of other pedestrians or riders on crowded streets or in an office environment, Kobashi explained.

The seat folds down and the footrests fold up, so it fits in a compact package that looks a bit like a slim boombox. It's not George Jetson's foldable space car, but you can grab it by the handle and roll it around like a suitcase.

But the questions many passersby in Times Square asked were "Where do I get one?" and "How much does it cost?" Alas, Honda doesn't know yet. Guess we'll have to wait for the future to come.

Some more photos of the device in Times Square and slides from a technical presentation the Honda engineers gave us:

honda u3-x

honda u3-x

honda u3-x

honda u3-x specs


Meet Geminoid F, a Smiling Female Android

UPDATE2: Just added more photos and a video of the android.

UPDATE: Ishiguro just told me that they won't be able to provide "any private information on the model" who served as the template for Geminoid F and that her identity will be kept "confidential."


Photo: Osaka University

Japanese roboticist Hiroshi Ishiguro unveiled today his latest creation: a female android called Geminoid F. The new robot, a copy of a woman in her twenties with long dark hair, can smile, frown, and change facial expressions more naturally than Ishiguro's previous androids.

Ishiguro, a professor at Osaka University, is famous for creating a robot replica of himself, the Geminoid HI-1, a telepresence android that he controls remotely. The new Geminoid F ("F" stands for female) is also designed to be remote controlled by a human operator.

In a press conference in Osaka, Ishiguro demonstrated how the android could mimic the facial expressions of the woman as she sat in front of a computer with cameras and face-tracking software.

Here's a video I put together:

Ishiguro built the android as part of his work at Osaka University and ATR Intelligent Robotics and Communication Laboratories, with collaboration from Kokoro Co., a Japanese firm that specializes in animatronics and ultrarealistic androids.

In designing Geminoid F, Ishiguro's team and Kokoro engineers wanted to create an android that could exhibit a wide range of natural expressions without requiring as many actuators as other androids they'd developed. In particular, they wanted the robot to sport a convincing smile -- not just any smile but, as Kokoro put it, a "toothy smile." It can also frown.

geminoid f
Photos: Osaka University

Whereas the Geminoid HI-1 has some 50 actuators, the new Geminoid F has just 12. What's more, the HI-1 robot requires a large external box filled with compressors and valves. With Geminoid F, the researchers embedded air servo valves and an air servo control system into its body, so the android requires only a small external compressor.

The new design helped reduce the android's cost, said Kokoro, which will sell copies of Geminoid F for about 10 million yen (US $110,000). Ishiguro and his collaborators plan to test the android in hospitals and also show it off at science museums and other venues. 

Ishiguro's previous androids, in addition to his own copy, include replicas of his then four-year-old daughter and of a Japanese TV newscaster. I couldn't find more details about the identity of Geminoid F's template, only that she is "one-quarter non-Japanese."

But I agree with Ishiguro when he says that one of the new android's advantages over his own replica (photo on the right) is its friendlier appearance: people will be more eager to interact with it. Would anyone disagree?


Photo: Makoto Ishida for IEEE Spectrum


Who's Afraid of the Uncanny Valley?

Are you creeped out by realistic, humanlike robots?

To pay homage to the vast assortment of anthropomorphic automatons, lifelike mannequins, and CGI humans out there, IEEE Spectrum prepared a, dare we say, beautiful slideshow. Watch our Ode To the Uncanny Valley below and then tell us about your reaction.

Many people say they find such imagery eerie, creepy, scary, freaky, frightening. One explanation for this visceral reaction is that our sense of familiarity with robots increases as they become more humanlike -- but only up to a point. If lifelike appearance is approached but not attained, our reaction shifts from empathy to revulsion.

This descent into creepiness is known as the uncanny valley. It was proposed by Japanese roboticist Masahiro Mori in a 1970 paper, and has since been the subject of several studies and has gained notoriety in popular culture, with mentions in countless YouTube videos and even on a popular TV show. The uncanny valley is said to have implications for video game design and is blamed for the failure of at least one major Hollywood animation movie.

Yet it remains a controversial notion in some robotics circles. Is it a valid scientific conjecture or just pseudoscience?

There is something appealing about a simple concept that can explain something profound about our humanity and our creations. It's even more appealing when you see it as a graph (the one below is based on the Wikipedia version with some images added for fun; apparently the graph concocted by Mori was more elaborate, according to a note here).

You can see on both curves (solid line for still robots and dashed line for robots that move) how familiarity (vertical axis) increases as human likeness (horizontal axis) increases, until it plunges and then increases again -- hence the valley in uncanny valley.
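The shape of the curve can be caricatured with a toy function: a steady rise in familiarity minus a sharp dip just short of full human likeness. This is purely illustrative -- Mori's 1970 paper gave no equation, and every parameter below (the dip's position, depth, and width) is invented for the sketch:

```python
import math

def familiarity(h, dip_center=0.8, dip_depth=1.5, dip_width=0.07):
    """Notional familiarity as a function of human likeness h in [0, 1].

    A linear rise minus a Gaussian 'valley' near high (but not full)
    human likeness. All parameters are made up for illustration.
    """
    return h - dip_depth * math.exp(-((h - dip_center) ** 2) / (2 * dip_width ** 2))

# Sample the curve and locate the bottom of the valley.
hs = [i / 100 for i in range(101)]
fs = [familiarity(h) for h in hs]
valley_h = hs[fs.index(min(fs))]
print(f"valley bottom near human likeness = {valley_h:.2f}")  # close to the dip_center
```

Plotting `fs` against `hs` reproduces the familiar picture: familiarity climbs, plunges below neutral near the valley, then recovers to its highest value at full human likeness.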

As a kind of benchmark, the uncanny valley could in principle help us understand why some robots are more likable than others. In that way roboticists would be able to create better designs and leap over the creepiness chasm. But what if there's no chasm? What if you ask a lot of people in controlled experiments how they feel about a wide variety of robots and when you plot the data it doesn't add up to the uncanny valley graph? What if you can't even collect meaningful data because terms like "familiarity" and "human likeness" are too vague?

When Mori put forward the notion of the uncanny valley, he based it on assumptions and ideas he had on the topic. It was an interesting, prescient conjecture, given that there weren't that many humanoid robots around, let alone a CGI Tom Hanks. But as scientific hypotheses go, it was more speculation than a conclusion drawn from hard empirical data. This is what he wrote at the end of his 1970 paper:

Why do we humans have such a feeling of strangeness? Is this necessary? I have not yet considered it deeply, but it may be important to our self-preservation.

We must complete the map of the uncanny valley to know what is human or to establish the design methodology for creating familiar devices through robotics research.

In a recent Popular Mechanics article, writer Erik Sofge discusses some of the problems with the theory:

Despite its fame, or because of it, the uncanny valley is one of the most misunderstood and untested theories in robotics. While researching this month's cover story ("Can Robots Be Trusted?" on stands now) about the challenges facing those who design social robots, we expected to spend weeks sifting through an exhaustive supply of data related to the uncanny valley—data that anchors the pervasive, but only loosely quantified sense of dread associated with robots. Instead, we found a theory in disarray. The uncanny valley is both surprisingly complex and, as a shorthand for anything related to robots, nearly useless.

Sofge talked to some top roboticists about their views of the uncanny. Cynthia Breazeal, director of the Personal Robots Group at MIT, told him that the uncanny valley is "not a fact, it's a conjecture," and that there's "no detailed scientific evidence" to support it. David Hanson, founder of Hanson Robotics and creator of realistic robotic heads, said: "In my experience, people get used to the robots very quickly. ... As in, within minutes."

Sofge also talked to Karl MacDorman, director of the Android Science Center at Indiana University, in Indianapolis, who has long been investigating the uncanny valley. MacDorman's own view is that there's something to the idea, but it's clearly not capturing all the complexity and nuances of human-robot interaction. In fact, MacDorman believes there might be more than one uncanny valley, because many different factors -- odd combinations like a face with realistic skin and cartoonish eyes, for example -- can be disconcerting.

Hiroshi Ishiguro, a Japanese roboticist who's created some of the most striking androids, and a collaborator, Christoph Bartneck, now a professor at Eindhoven University of Technology, conducted a study a few years ago using Ishiguro's robotic copy, concluding that the uncanny valley theory is "too simplistic." Here's part of their conclusion:

The results of this study cannot confirm Mori's hypothesis of the Uncanny Valley. The robots' movements and their level of anthropomorphism may be complex phenomena that cannot be reduced to two factors. Movement contains social meanings that may have direct influence on the likeability of a robot. The robot's level of anthropomorphism does not only depend on its appearance but also on its behavior. A mechanical-looking robot with appropriate social behavior can be anthropomorphized for different reasons than a highly human-like android. Again, Mori's hypothesis appears to be too simplistic.

Simple models are in general desirable, as long as they have a high explanatory power. This does not appear to be the case for Mori’s hypothesis. Instead, its popularity may be based on the explanatory escape route it offers. The Uncanny Valley can be used in attributing the users’ negative impressions to the users themselves instead of to the shortcomings of the agent or robot. If, for example, a highly realistic screen-based agent received negative ratings, then the developers could claim that their agent fell into the Uncanny Valley. That is, instead of attributing the users’ negative impressions to the agent’s possibly inappropriate social behavior, these impressions are attributed to the users. Creating highly realistic robots and agents is a very difficult task, and the negative user impressions may actually mark the frontiers of engineering. We should use them as valuable feedback to further improve the robots.

It's a good thing that researchers are trying to get to the bottom of the uncanny valley (no pun intended). Advancing the theory by finding evidence to support it, or disprove it, would be valuable to robotics because human-robot interaction and social robots are becoming ever more important. If we want to have robots around us, we need to find out how to make them more likable, engaging, and easier to interact with, and naturally their looks play a key role in that regard. Moreover, human-looking robots could be valuable tools in psychology and neuroscience, helping researchers study human behavior and even disorders like autism.

Ishiguro recently told me that the possibility that his creations might result in revulsion won’t stop him from "trying to build the robots of the future as I imagine them." I for one admire his conviction.

What do you think? Should we continue building robots in our image?


Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York, N.Y.