Automaton

Ping Pong Robot Learns by Doing

Katharina Mülling (holding the emergency stop switch), Jan Peters, and Jens Kober monitor their ping pong robot practicing against a ball gun. All photos: Axel Griesch/MPG, München 

Despite all the recent advances in robotics, one fundamental task appears to remain as hard as ever: robot programming.

To be sure, robot programming in industrial settings has evolved significantly, from a series of mechanical switches to advanced programming languages and teach-pendant devices for trajectory planning. But getting robots to do their jobs still requires a great deal of human labor -- and human intelligence.

The situation is even worse when it comes to programming robots to do things in non-industrial environments. Homes, offices, and hospitals are unstructured spaces, where robots need to deal with more uncertainty and act more safely.

To overcome this programming bottleneck, engineers need to create robots that are more flexible and adaptable -- robots that, like humans, learn by doing.

That's what a team led by Dr. Jan Peters at the Robot Learning Lab, part of the Max-Planck Institute for Biological Cybernetics, in Tübingen, Germany, is trying to do. Peters wants to transform robot programming into robot learning. In other words, he wants to design robots that can learn tasks effortlessly instead of requiring people to painstakingly determine their every move.

In the video below, you can see his students taking their robot "by the hand" to teach it motor skills needed for three tasks: paddle a ball on a string, play the ball-in-a-cup game, and hit a ping pong ball. 

Here's how Dr. Peters explained to Automaton his team's approach: "Take the example of a person learning tennis. The teacher takes the student by the hand and shows basic movements: This is a forehand, this is a backhand, this is a serve. Still, it will take hours and hours of training before the student even feels comfortable at performing these behaviors. Even more practice is needed for the student to be able to play an actual game with these elementary behaviors." But still, he adds, humans succeed at learning the task. Why can't robots do the same? "That's what we're trying to do: Make our robots mimic the way humans learn new behaviors."

In the first part of the video, graduate student Katharina Muelling shows the robot how to paddle a ball on a string by performing the action while holding the robot's "hand." The robot decomposes the movement into primitive motor behaviors -- a discrete motor primitive that modulates the rhythmic paddling with an increasing amplitude until it becomes a stable rhythmic behavior -- and quickly "learns" how to perform the task.
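The combination of a discrete primitive and a rhythmic primitive can be caricatured in a few lines. This is a toy sketch, not the lab's actual controller: a discrete primitive ramps the oscillation amplitude toward a target while a rhythmic primitive supplies the paddling motion, so the movement settles into a stable rhythm.

```python
import math

def paddle_trajectory(t, target_amp=1.0, tau=1.0, freq=2.0):
    """Toy motor-primitive combination (illustrative only): a discrete
    primitive ramps the amplitude toward target_amp, while a rhythmic
    primitive supplies the oscillation itself."""
    amp = target_amp * (1.0 - math.exp(-t / tau))    # discrete primitive
    return amp * math.sin(2 * math.pi * freq * t)    # rhythmic primitive

# The amplitude grows from zero and the paddling becomes a steady rhythm.
samples = [paddle_trajectory(0.1 * k) for k in range(100)]
```

All the parameter names and values here are invented for illustration; the real system learns its primitives from the demonstration data.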

For comparison purposes, the researchers tried to manually program the robot's motors to perform the same task. It took them three months and the result wasn't as good as the imitation learning experiment, which took less than an hour, Dr. Peters says. 

In the second part of the video, Muelling teaches the robot the ball-in-a-cup game. [See photo on the right; the robot has to swing the yellow ball, which is attached to a string, and make it land into the blue cup.] This skill is significantly more difficult than paddling the ball on a string, and the robot doesn't have enough data to simply imitate what the human did. In fact, when the robot attempts to reproduce the human action, it can't match the accelerations of the human hand and the ball misses the cup by a large margin. Here, self-improvement becomes key, Dr. Peters says.

"For every new attempt, when the robot reduces the distance by which the ball misses the cup, the robot receives a 'reward,' " he says. "The robot subsequently self-improves on a trial-by-trial basis. It usually gets the ball in the cup for the first time after 40 to 45 trials and it succeeds all the time after about 90 to 95 trials."
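That trial-by-trial loop can be sketched as a reward-guided search over a motion parameter. This is a stand-in for the group's actual policy-search method, with a hypothetical one-parameter "simulator" (`distance_to_cup`) and invented numbers: perturb the parameter, keep it whenever the reward (the negative miss distance) improves.

```python
import random

random.seed(0)

def distance_to_cup(theta):
    """Hypothetical stand-in for a trial: how far the ball misses the cup
    (0 = in) for swing parameter theta; the optimum here is theta = 2.0."""
    return abs(theta - 2.0)

def self_improve(trials=100, sigma=0.3):
    """Toy trial-by-trial self-improvement: keep a perturbed parameter
    whenever it earns a higher reward (a smaller miss distance)."""
    theta = 0.0                       # initial imitation misses badly
    best = -distance_to_cup(theta)    # reward = negative miss distance
    for _ in range(trials):
        candidate = theta + random.gauss(0.0, sigma)
        reward = -distance_to_cup(candidate)
        if reward > best:             # the 'reward' for reducing the miss
            theta, best = candidate, reward
    return theta

theta = self_improve()
```

The real robot searches over far more parameters and uses smarter exploration, but the reward-shaping idea is the same.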

How does the robot's learning ability compare to a human's? PhD student
Jens Kober, who led this particular experiment, wanted to find out: He went home for a holiday last year and took advantage of a large extended family -- always good subjects for a scientific experiment. He showed his many cousins the ball-in-a-cup game and rewarded them with chocolate. It turned out that the younger ones (around 6 years old) would not learn the behavior at all, the ones in their early teens (10 to 12) would learn it within 30 to 35 trials, and the grownups would be much faster.

"His supervisor may be the only person in his lab who has not managed to learn this task," Dr. Peters quips.

In the last part of the video, the researchers tackle an even harder task: ping pong. Again, Muelling teaches the robot by holding its "hand," this time to hit a ping pong ball sent by a ball gun [photos above]. The challenge here is to use -- and modify -- previously learned basic motions and combine them with visual stimuli: The robot needs to keep track of the ball, which may come from different directions, and then execute the right set of motions.

Some of their work, part of the GeRT consortium, a project that aims to generalize robot manipulation tasks, is still preliminary, Dr. Peters notes. But he's confident they can teach their robot to become a good ping pong player. How good? Maybe not as good as Forrest Gump, but good enough to beat everyone in the lab.

Samuel Bouchard is a co-founder of Robotiq in Quebec City.

WheeMe Massage Robot Roams Around Your Back


There are several therapeutic robots out there, but this one is a bit different. While robots like Paro the baby seal require you to stroke them, the DreamBots WheeMe caresses you.

According to the company, this massage robot uses "unique tilt sensor technology" to move slowly across a person's body "without falling off or losing its grip." As the bot roams around, its four sprocket-like rubber wheels press gently on the skin.

Founded by a bunch of Israeli electronics and defense engineers, DreamBots will show off the WheeMe at CES next January. There's no word on price yet. The company admits the robot can't give you a deep tissue massage, because it's very light (240 grams, or 8.5 ounces). But they claim the device can provide "a delightful sense of bodily pleasure." 

It's unclear how big the market is for a body-rolling robot. I guess we'll have to wait and see.

In the meantime, watch the WheeMe navigate:

Another video and more images:



Robots With Knives: A Study of Injury
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

Therapeutic Robots Are Cute But Costly
Fri, July 31, 2009

Blog Post: These 'bots are soft, cuddly, and designed to be universally likable, but they don't come cheap.

Robotic Bed for Sleep Research
Mon, September 27, 2010

Blog Post: Swiss engineers have used cables, pulleys, and motors to build a robotic bed.

French Robot to Assist Elderly, Disabled
Tue, March 09, 2010

Blog Post: A French robotics company has unveiled a robot designed to assist elderly and disabled people

Falling Robot Gecko Rights Itself in Mid-Air

We've seen robot geckos climbing walls before. Now researchers are adding a twist -- literally. If this bio-inspired bot falls, rather than crashing into pieces, it can right itself mid-air and land on its feet.

The UC Berkeley researchers, led by graduate student Ardian Jusufi, describe their results in a paper published today in a special issue of the Institute of Physics journal Bioinspiration & Biomimetics. The video shows how a gecko uses its tail to right and turn itself mid-air and land on its feet. The researchers studied how a real gecko does the trick, modeled the maneuver on a computer, and built a robot gecko that can do the same.

We're seeing more and more bio-inspired robots lately, and that looks like a promising trend in robotics.

"Because biologists and engineers are typically trained quite differently, there is a gap between the understanding of natural flight of biologists and the engineer's expertise in designing vehicles that function well," the special issue's editor, David Lentink of Wageningen University, in the Netherlands, writes in an accompanying editorial. "In the middle however is a few pioneering engineers who are able to bridge both fields."

Other articles describe how scientists are trying to mimic the natural abilities of hummingbirds, cruising seagulls, flapping insects, and floating maple seeds to improve the design of air vehicles.

But the robot I really want to see? The amazing gliding snake!

See more videos below, including one from Jake Socha and his team at Virginia Tech showing the mystifying skills of flying snakes, which direct their flight mid-air by slithering.

PR2 Gets Awesome Kinect Teleoperation System

This is just the first taste of what a hacked-up Kinect sensor is capable of… That motion capture and teleoperation system looks pretty sweet, and as Willow Garage says, they’ve basically just started messing with the capabilities of the sensor, and things are already progressing very quickly.

Kinect is $150, and the open source drivers are free. Go crazy.

[ Willow Garage ]

Hacked Kinect Runs on iRobot Create

If you can’t wait for a hacked Neato LIDAR system and you need some cheap localization and mapping hardware, you might want to take a good look at Microsoft’s Kinect system, which has already been hacked open and made available to anyone using ROS.

MIT’s Personal Robotics Group has put together the demo in the video below, which shows an iRobot Create plus a Kinect sensor performing 3D SLAM (simultaneous localization and mapping) and also reacting to gesture inputs from a human, which is pretty cool. Most of the heavy lifting is done by an offboard computer, but there’s no reason that the whole system couldn’t be easily integrated into the robot itself, since I think I remember hearing that Kinect is minimally intensive when it comes to processing requirements.
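The mapping half of that demo boils down to fusing range readings into a map. Here's a bare-bones sketch of an occupancy-grid update from a single scan -- heavily simplified, and assuming the robot's pose is already known, which sidesteps the "localization" half that makes real SLAM hard:

```python
import math

def update_grid(grid, pose, scan, cell=0.1):
    """Mark cells hit by range readings as occupied. grid maps
    (ix, iy) -> 1; pose is (x, y, heading) in meters/radians; scan is
    a list of (bearing, range) pairs. A real SLAM system must also
    estimate the pose itself -- here it is assumed known."""
    x, y, heading = pose
    for bearing, rng in scan:
        hx = x + rng * math.cos(heading + bearing)
        hy = y + rng * math.sin(heading + bearing)
        grid[(int(round(hx / cell)), int(round(hy / cell)))] = 1
    return grid

# Two readings: a wall 1 m straight ahead, and one 0.5 m to the left.
grid = update_grid({}, (0.0, 0.0, 0.0), [(0.0, 1.0), (math.pi / 2, 0.5)])
```

The Kinect delivers a full depth image rather than a handful of ranges, and the MIT demo builds a 3D map, but the fuse-readings-into-cells idea is the same.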

This kind of thing is really, really fantastic because we’re starting to see high quality sensing systems that provide awesome data being available for what’s basically dirt cheap. Remember those DARPA Grand Challenge cars and their hundreds of thousands of dollars of ranging sensors? It was only a few years ago that 3D sensing hardware was totally, completely out of range for hobby robotics, and now, in the space of like 6 months, we’ve actually got options. Yeah, it’s piggybacking off of other tech, but there’s nothing wrong with that, and it’s only going to get better as the gaming and automotive industry invest more resources in making their machines smarter, not just faster.

Thanks Philipp!

Obama Meets Japanese Robots

obama hrp-4c humanoid robot
President Obama meets HRP-4C, created by a team led by Dr. Kazuhito Yokoi [right] at AIST.

At the Asia-Pacific Economic Cooperation summit, which this year took place in Yokohama, Japan, U.S. President Barack Obama met not only diplomats, heads of state, and world leaders -- he also met several robots.

Before walking into a meeting, Obama was greeted by HRP-4C, the female humanoid robot created at Japan’s National Institute of Advanced Industrial Science and Technology, or AIST. The android talked and gestured enthusiastically, but this time it didn't show off its dance moves.

Next, the robotic baby seal Paro, also invented at AIST, made an appearance. Dr. Takanori Shibata, the creator of Paro, told the president that the robot is used as a therapeutic device in hospitals, and Obama gave the squealing furry creature a good caress.

obama paro therapeutic robot baby seal
Obama pets Paro the robot seal, created by Dr. Takanori Shibata [right] from AIST.

Then it was time for a ride on what appears to be the latest version of Toyota's i-REAL personal mobility vehicle. Well, it wasn't much of a ride. Obama drove an inch forward, but when the machine suddenly tilted back the president almost jumped out of it. "That's what we're going to be driving," he quipped.

Watch Obama meeting the Japanese robots:

Obama is not the first U.S. president to meet a humanoid robot. In 2005, Albert HUBO, the Korean humanoid that features an Albert Einstein head, shook hands with President George W. Bush, who seemed fearless despite the robot's odd looks and a previous incident with robotic technologies. 

obama toyota i-real
Obama sits on Toyota's i-REAL, a futuristic personal mobility vehicle.

Here's a longer video showing various heads of state -- Medvedev of Russia, Hu Jintao of China, Chile's Sebastián Piñera (famous after the trapped miners incident), among others -- interacting with the Japanese technologies:

Images: APEC; video: NECN


Honda's U3-X Unicycle of the Future
Mon, April 12, 2010

Blog Post: It only has one wheel, but Honda's futuristic personal mobility device is no pedal-pusher

Meet Geminoid F, a Smiling Android
Sat, April 03, 2010

Blog Post: Geminoid F, a copy of a woman in her twenties, can smile, frown, and change facial expressions

How to Make a Humanoid Robot Dance
Tue, November 02, 2010

Blog Post: Japanese roboticists have showed off a female android dancing with a troupe of human performers

The Invasion of Cute, Therapeutic Robots
Fri, July 31, 2009

Blog Post: These robots are soft, cuddly, and designed to be universally likable, but they aren't cheap

Geminoid F Gets Job as Robot Actress

More news about Geminoid F, the ultrarealistic android unveiled early this year: the robot got a job.

Geminoid F is working as an actress, taking the stage in a play that opened yesterday in a Tokyo theater.

In the 20-minute play, titled "Sayonara" ("good bye" in Japanese), the android shares the stage with another actress (of the human kind) named Bryerly Long. Long plays the role of a young woman who is suffering from a fatal illness and whose parents bring her an android to serve as a companion.

A human operator controls the robot from a soundproof chamber behind the stage. A microphone captures the operator's voice and cameras track head and face movements. When the operator speaks or moves, the android follows suit.

The robot is in a permanent sitting posture, so movements are limited to the head, torso, and arms. The performance is "a bit mechanical," as Reuters puts it, but that doesn't seem to be a problem: the android is playing the role of an android after all. 

Geminoid F is a creation of Hiroshi Ishiguro, a professor at Osaka University and researcher at ATR Intelligent Robotics and Communication Laboratories.

The "Android-Human Theater" project is a collaboration between Ishiguro and Japanese director Oriza Hirata, who writes and directs.

According to Ishiguro, the play explores the question, "What do life and death mean to humans and robots?," and it will "alter the audience's images of robots and humans, and present a compelling fusion of theater arts and science."

Would you go watch the play?

Video and photos: ITN


Female Android Geminoid F Unveiled 
Sat, April 03, 2010

Blog Post: Geminoid F, an android copy of a woman in her 20s, can talk, gesture, and smile

Geminoid F Looks Even More Realistic
Mon, November 01, 2010

Blog Post: The female android features facial movements even more realistic than before 

Geminoid F: More Video and Photos 
Tue, April 20, 2010

Blog Post: IEEE Spectrum obtained exclusive images and video of Hiroshi Ishiguro's new android

The Man Who Made a Copy of Himself
April 2010

Article: Japanese roboticist Hiroshi Ishiguro is building androids to understand humans--starting with himself


Smart Systems Help Elderly Stay Healthy, Independent

The Quality of Life Technology Center, in Pittsburgh, is developing intelligent systems to help older adults stay independent and healthy.

This slide show is part of our special report "Robots for Real."


Rat Brain Robot Grows Up

Kevin Warwick is most certainly the preeminent cyborg of our time. More than a decade ago he implanted an RFID chip in himself to control simple functions like turning on the lights, and it's been 8 years since he inserted a more elaborate, 100-electrode array into the nerves in his forearm that allowed him to manipulate a robotic arm on another continent. He's assisted students at the University of Reading, in England, who wished to implant magnets in the tips of their fingers and at least one who wished for an electrode in the tongue (with the help, Warwick says, of a Manchester tattoo artist who goes by the name "Dr. Evil").

More recently, he's been growing rat neurons on a 128-electrode array and using them to control a simple robot consisting of two wheels with a sonar sensor. The rudimentary little toy has no microprocessor of its own -- it depends entirely on a rat embryo's brain cells. The interesting question is just how big one of these neuron-electrode hybrid brains can grow, and those brain cell networks are now getting more complicated, and more legitimately mammalian, Warwick said this week in a keynote speech at the IEEE Biomedical Circuits and Systems conference. Warwick's twist predates the living rat-controlled robot we wrote about recently, and it just goes to show that weird cyborg animal projects have virtually unlimited potential.

To start off a rat brain robot, embryonic neurons are separated out and allowed to grow on an electrode array. Within minutes the neurons start to push out tentacles and link up to each other, becoming interconnected dendrites and axons. A dense mesh of about 100,000 neurons can grow within several days. After about a week, Warwick and his collaborators can start to pulse the electrodes under the neural mesh in search of a pathway -- that is, when neurons near an active electrode fire, another group of neurons on a different side of the array shows an inclination to fire as well.

Once they have a pathway -- the groups fire in tandem at least a third of the time -- the University of Reading researchers can use that connection to get the robot to roam around and learn to avoid crashing into walls. They connect the electrode array to the robot using Bluetooth. When the sonar senses it's nearing a wall, it stimulates the electrode at one end of the neural pathway, and at first the brain sends back a coherent response only every once in awhile. The robot interprets the response as an instruction to turn its wheels. With time and repetition, the neural pathways become stronger, and the robot runs into the walls less frequently. In effect, the robot works out for itself how to not bash into obstacles.
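The closed loop described above can be caricatured in a short simulation. This is a software stand-in, not the real system (which stimulates living neurons over Bluetooth): when the fake sonar sees a wall, the pathway is stimulated, and each stimulation makes a response slightly more likely, mimicking the strengthening the researchers observe.

```python
import random

random.seed(1)

class PathwaySim:
    """Toy stand-in for a cultured neural pathway: stimulation elicits a
    'turn' response with a probability that grows with repetition."""
    def __init__(self, p_respond=0.3, learn_rate=0.02):
        self.p = p_respond
        self.lr = learn_rate

    def stimulate(self):
        responded = random.random() < self.p
        self.p = min(1.0, self.p + self.lr)   # repetition strengthens it
        return responded

def run_robot(steps=200, wall_threshold_cm=30):
    pathway = PathwaySim()
    crashes = 0
    for _ in range(steps):
        sonar_cm = random.uniform(5, 100)     # fake sonar reading
        if sonar_cm < wall_threshold_cm:      # wall ahead: stimulate
            if not pathway.stimulate():       # no response -> no turn
                crashes += 1
    return crashes, pathway.p

crashes, strength = run_robot()
```

All probabilities and thresholds here are invented; the point is only the shape of the loop: sense, stimulate, and let repetition do the training.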

To add complexity to the experiments, Warwick's lab is now collaborating with a Canadian group to culture neurons in three dimensions, meaning they are attempting to grow a network of 30 million neurons -- a big step towards the 100 billion found in a human brain. After that, the next step will be to bring in human neurons. "If we have 100 billion human neurons," Warwick says, "should we give it rights? Does it get to vote?" More to the point, he wonders: "Is it conscious?"

Read also:

Cyborg Fly Pilots Robot
Thu, August 26, 2010

Blog Post: Swiss researchers have made a fruit fly steer a mobile robot in the lab

Man Replaces Eye with Bionic Camera
Fri, June 11, 2010

Blog Post: Canadian filmmaker Rob "Eyeborg" Spence has replaced his false eye with a bionic camera eye

Monkey Controls Robot with Mind
Wed, June 02, 2010

Blog Post: A monkey with a brain-machine interface commands a 7-degree-of-freedom robotic arm

The Robot Baby Reality Matrix
July 2010

Article: Some robot babies look real. Some act real. A few do both

How to Make a Humanoid Robot Dance

Japanese roboticists recently showed off a female android singing and dancing along with a troupe of human performers. Video of the entertaining and surprisingly realistic demonstration went viral on the Net.

How did they do it?

To find out, I spoke to Dr. Kazuhito Yokoi, leader of the Humanoid Research Group at Japan’s National Institute of Advanced Industrial Science and Technology, known as AIST.

The secret behind the dance routine, Dr. Yokoi tells me, is not the hardware -- it's software.

The hardware, of course, plays a key role. The AIST humanoids group is one of the world’s top places for robot design. Their HRP-2 humanoids are widely used in research. And the group's latest humanoids, the HRP-4 and a female variant, the HRP-4C, which is the robot in the dance demo, are even more impressive.

But the biggest innovation here is new software for programming the robot's movements. The software is similar to tools widely used in CG character animation. You basically click on the legs, arms, head, or torso and drag them to the position you want. You create a sequence of key poses and the software generates the trajectories and low-level control to make the robot move.

So by editing a relatively small number of key poses you can compose complex whole-body motion trajectories. See a screen shot of the software interface below, with a 6.7-second sequence that uses only eight key poses:

The software developed at AIST to create sequences of movements.

The software verifies that the robot can indeed perform the transitions from one pose to the next. If the angular velocity or range of one of the joints exceeds the maximum values, the software adjusts the pose, so that it's feasible to execute.

The software also monitors the robot’s stability. When it generates a trajectory between two key poses, it checks that the waist trajectory won't create instabilities and that foot trajectories will result in enough contact with the floor. If a pose is not safe, the software finds a similar pose that would keep the robot in balance.
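A drastically simplified version of the feasibility side of those checks, for a single joint between two key poses, could look like the sketch below. All limits and numbers are invented for illustration; AIST's editor works on whole-body trajectories, not one joint at a time.

```python
def adjust_pose(prev_angle, next_angle, dt, max_vel, max_angle):
    """Clamp a key pose so the joint stays within its range of motion and
    the transition stays within its velocity limit -- a toy version of the
    adjustments the pose editor performs automatically."""
    # Respect the joint's range of motion.
    next_angle = max(-max_angle, min(max_angle, next_angle))
    # Respect the maximum angular velocity over the transition.
    max_step = max_vel * dt
    delta = next_angle - prev_angle
    if abs(delta) > max_step:
        next_angle = prev_angle + max_step * (1 if delta > 0 else -1)
    return next_angle

# A 2.0 rad target in 0.5 s with a 1.0 rad/s limit gets pulled back to 0.5 rad.
adjusted = adjust_pose(0.0, 2.0, dt=0.5, max_vel=1.0, max_angle=1.5)
```

The real software additionally replans the trajectory between the adjusted poses and re-verifies balance, which this sketch ignores.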

After creating a sequence, the user can preview the resulting motion on the 3D simulator—or, if you have an HRP-4C you can upload the code to the robot and watch it dance.

Here's a video showing how the software works:

Dr. Yokoi and colleagues Shin’ichiro Nakaoka and Shuuji Kajita describe the software in a paper titled "Intuitive and Flexible User Interface for Creating Whole Body Motions of Biped Humanoid Robots," presented at last month's IEEE/RSJ International Conference on Intelligent Robots and Systems.

One of their goals in developing the software, Dr. Yokoi says, is simplifying the creation of robot motion routines, so that even non-roboticists can do it. "We want other people -- like CG creators, choreographers, anyone -- to be able to create robot motions," he adds.

Here’s my full interview with Dr. Yokoi, in which he describes how the new software works, what it took to create the dance routine, and why he thinks Apple's business models could help robotics.

Erico Guizzo: I watched the video of the HRP-4C dancing with the human dancers several times—it’s fascinating. How did you have the idea for this demonstration?

Kazuhito Yokoi: We wanted to prepare a demonstration for this year’s Digital Content Expo, in Tokyo, and one of our colleagues, Dr. [Masaru] Ishikawa from the University of Tokyo, suggested this kind of event. At last year’s Expo, we used the robot as an actress. We didn’t have the software to create complex motions, so we were limited to movements of the arms and face. It was a fun presentation. But this time we wanted to do something different, and one of the ideas we had was a dance performance. One of the key collaborators was SAM, who is a famous dancer and dance choreographer in Japan. He created our dance routine. The human dancers are members of his dance school.

EG: Did he choreograph the robot’s dance movements as well?

KY: We wanted to make the dance as realistic as possible. So we didn’t choreograph the robot first. Instead, SAM created a dance using one of his students. Then we used the software to “copy” the dance from the human to the robot.

HRP-4C performs with human dancers.

EG: How long did this process take?

KY: Programming the software is relatively fast. But because this was a complex performance, we did several rehearsals. After SAM created the dance and we transferred it to the robot, he watched the robot and wanted to make some adjustments to the choreography. We expected that would happen because, of course, there are differences between the abilities of a human and a humanoid. For example, the joint angle and speed have maximum values. So it’s difficult to copy the dance exactly, but we tried to copy as close as possible. Then we transferred SAM's changes to the robot and we did another rehearsal. And at some point we also brought in the human dancers. I think we spent about one month until we had the final performance.

EG: When you’re using the software, what if you program a movement that the robot can’t execute, either because of angle or speed limitations or because it would fall?

KY: What you give the software are key poses. If, for example, you have one pose and you create a new pose and making that transition would require a joint angular velocity higher than what the robot can perform, then the software would inform you about that, and you can adjust the pose, reducing the final angle of the joint. The software also automatically keeps track of stability. Of course, users should have some basic understanding of their robot, how it balances, but the software does the rest—it will alert the user if a pose is unstable and correct the pose.

EG: Does the software compute ZMP [Zero Moment Point] to detect poses that are unstable?

KY: Yes, we use the ZMP concept. Again, the user can freely design the key poses. If a pose is not stable, the software automatically detects that the pose is a problem and modifies it. So it’s doing that in real time, as you design your sequence of movements. And if you don’t like the “corrected” pose you can choose another pose and keep trying until you’re satisfied with the movements. And of course, you can try your whole choreography using the software, before you test it in the real robot!
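For the curious, the ZMP of a simplified point-mass ("cart-table") model reduces to a one-line formula, x_zmp = x_com - (z_com / g) * x''_com, and stability means the ZMP stays inside the support polygon. Here's an illustrative check in that simplified model -- not AIST's actual computation, and the support region is reduced to a 1-D interval:

```python
def zmp_x(com_x, com_z, com_x_accel, g=9.81):
    """Zero Moment Point (x-coordinate) for a point-mass / cart-table
    model: x_zmp = x_com - (z_com / g) * x''_com."""
    return com_x - (com_z / g) * com_x_accel

def pose_is_stable(com_x, com_z, com_x_accel, foot_min, foot_max):
    """Stable if the ZMP stays inside the foot's support region
    (a 1-D interval in this toy version)."""
    z = zmp_x(com_x, com_z, com_x_accel)
    return foot_min <= z <= foot_max

# Standing still: the ZMP sits under the center of mass, inside the foot.
ok = pose_is_stable(0.0, 0.8, 0.0, -0.1, 0.1)
# A hard lurch pushes the ZMP outside the foot: the pose is flagged.
bad = pose_is_stable(0.0, 0.8, 3.0, -0.1, 0.1)
```

When a pose fails a check like this, the software searches for a nearby pose that passes, which is the "corrected" pose Dr. Yokoi describes.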

The software automatically adds a key pose needed to maintain stability.

EG: Was the software designed specifically for HRP-4C?

KY: No. The software is robot independent. You just need the robot model. For example, we have the model for HRP-2, so we can create HRP-2 movements. We also have the model for HRP-4, and we recently created movements for this robot as well.

[See a recent video of HRP-4 below.]

EG: Speaking of HRP-4, are HRP-4 and HRP-4C the same robot with just different exteriors? And are they both made by Kawada Industries?

KY: They are not the same. HRP-4C has 8 actuators in its head and it can make facial expressions. HRP-4 has no such kind of actuators. HRP-4 is made by Kawada. HRP-4C is special. It’s a collaboration. At AIST we designed the robot, but we have no factory to make robot hardware, so we collaborated with [Japanese robotics firms] Kawada and Kokoro. Kawada makes the body and Kokoro the head. You may know the Geminoid created by Professor [Hiroshi] Ishiguro of Osaka University. He's made several androids. His androids are made by Kokoro. So we also asked them to develop our robot head for HRP-4C. They have very good know-how to make humanlike skin. That’s an important factor.

EG: Can you use the software to design other kinds of movements, such as tasks to help a person in a house?

KY: Yes. That’s our dream. We need more capabilities to do that, like recognizing a person and objects in the house, for example. That’s not part of this software. But this software lets you program any kind of movement. And we want more people to try to program the robot. Now only researchers can do that. But in our opinion that’s not good enough. We want other people -- like CG creators, choreographers, anyone -- to be able to create robot motions. And maybe that will lead to robotics applications not only in entertainment, but in industry and home applications too. Think about the iPhone. Many people want an iPhone because it has hundreds of nice software applications. Apple didn’t create all of those; they were developed by others, including some small developers, and they were able to have great success. So the iPhone is a platform -- video game consoles and computers are similar in that sense -- and we want to follow this business model.

EG: When will researchers and others be able to use the software?

KY: We just finished developing the software and we’ve not delivered it to anybody. We have not yet decided what kind of license we will adopt, but we have plans to make it available maybe by the end of next March.

EG: What about the HRP-4 and HRP-4C robots—who will be able to use them?

KY: If you buy one, you can use it. [Laughs.]

EG: So what is the goal of your group at AIST? Do you want to create humanoid robots to help other researchers who study robotics or do you want to develop robots that one day will actually be used in people’s homes and factories?

KY: Humanoid robots in homes and factories, as you mentioned, that’s our final goal. That’s our long, long final goal. But in the meantime, we think we can contribute to other application areas in humanoid robotics. One is hobby and toy humanoid robots -- it's a big area. The second consists of research platforms, like HRP-2 or HRP-4, that people in academia can use to develop new software or theories on how to control robots and how to make them perform tasks naturally. The third area is entertainment. That’s why we created the dance performance. We have also shown the HRP-4C wearing a wedding dress at a fashion show. Or used it as a master of ceremony. But our final goal is not just entertainment. For example this new software can make any kind of motion. Maybe we could use it to make the robot perform tasks to help elderly people, or to perform activities involving education or communication. There are many possibilities.

EG: AIST’s humanoids are among the most impressive. Where do you get inspiration for creating them? And do you always want to make them look more human or is it sometimes a good idea to make them look robotic?

KY: Good questions. I think it depends on the application. HRP-2, HRP-3, and HRP-4 look robotic. If a robot is just walking or doing some dangerous, dirty, or dull task, okay, it doesn’t need a human face. But if we want to bring our robots to the entertainment industry, for example, then a more humanlike appearance is more distinctive and maybe more attractive. That’s why we created a female humanoid. When we decided to bring our humanoids into the entertainment industry, we thought that a female type would be better.

AIST's humanoids HRP-2, 3, and 4. Photo: Impress/PC Watch

EG: Going back to the beginning of our conversation, about the HRP-4C dance, a lot of people have seen the video -- why do you think people are so fascinated with this demonstration?

KY: I don’t know. I guess this was a large trial in humanoid robotics. Dancing is something very human. You don't expect to see robots dancing like that with other dancers. Maybe people have seen smaller robots dancing, like Sony's QRIO or the Nao humanoid robot from Aldebaran. But for these small types of robot it’s difficult to collaborate or interact with humans. In our demonstration we wanted to show a realistic dance performance. And of course, we wanted it to be fun!

Images and videos: AIST

This interview has been condensed and edited.


Obama Meets Japanese Robots
Mon, November 15, 2010

Blog Post: The president was greeted by humanoid HRP-4C and caressed Paro the robotic seal

Honda's U3-X Unicycle of the Future
Mon, April 12, 2010

Blog Post: It only has one wheel, but Honda's futuristic personal mobility device is no pedal-pusher

Meet Geminoid F, a Smiling Android
Sat, April 03, 2010

Blog Post: Geminoid F, a copy of a woman in her twenties, can smile, frown, and change facial expressions

The Invasion of Cute, Therapeutic Robots
Fri, July 31, 2009

Blog Post: These robots are soft, cuddly, and designed to be universally likable, but they aren't cheap



IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:

Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
Jason Falconer
Angelica Lim
Tokyo, Japan
