Automaton

Personal Mobility Robot Operated by Wii Controller

The Personal Mobility Robot, or PMR, is a nimble robotic wheelchair that self-balances on two wheels like a Segway. The machine, based on a platform developed by Toyota, has a manual controller that the rider uses to change speed and direction [see photo, inset].

Now two University of Tokyo researchers have decided to upgrade the machine, making it controllable by a Wii remote controller. Why drive the thing with a Wii-mote? Well, why not?

Last year, when I visited the JSK Robotics Laboratory, part of the university's department of mechano-informatics and directed by Professor Masayuki Inaba, researchers Naotaka Hatao and Ryo Hanai showed me the PMR under Wii control. They didn't let me ride it, but watching them pilot the machine was fun.

Watch:

Watch in HD here.

The PMR project is part of the Information and Robot Technology (IRT) research initiative at the University of Tokyo. Researchers developed the machine to help elderly people and people with disabilities remain independent and mobile. It is designed to be reliable and easy to operate, capable of negotiating indoor and outdoor environments -- even slopes and uneven surfaces. It weighs 150 kilograms and can move at up to 6 kilometers per hour.

The PMR is a type of robot known as a "two-wheeled inverted pendulum mobile robot" -- like the Segway and many others. The advantage of a self-balancing two-wheeled machine is its smaller footprint (compared to, say, a four-wheeled one) and its ability to turn around its axis, which is convenient in tight spaces.

The machine assumes a lower configuration to allow a rider to climb on the seat. Then it raises itself, allowing the two wheels to dynamically balance the vehicle. (The two little wheels you see at the front and at the back are for safety, in case the machine tips over.)
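For the curious, here's what the core of such a balance loop might look like. It's a minimal sketch assuming a simple PD controller on the measured tilt angle; the PMR's actual controller and interfaces aren't public, so the gains and the imu/motors objects below are placeholders.

```python
import time

KP = 25.0   # proportional gain: tilt angle (rad) -> wheel torque
KD = 3.0    # derivative gain: tilt rate (rad/s) -> wheel torque

def balance_torque(tilt_angle, tilt_rate):
    """Drive the wheels back under the center of mass: the further (and
    faster) the body leans, the harder the wheels push in that direction."""
    return KP * tilt_angle + KD * tilt_rate

def balance_loop(imu, motors, dt=0.01):
    """Run the balance controller at 100 Hz. `imu` and `motors` stand in
    for whatever sensor and actuator interfaces the real robot exposes."""
    while True:
        angle, rate = imu.read_tilt()                  # placeholder API
        torque = balance_torque(angle, rate)
        motors.set_torque(left=torque, right=torque)   # placeholder API
        time.sleep(dt)
```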

In addition to Wii-mote controllability, the JSK researchers have been working on an advanced navigation system that is able to localize itself and plan trajectories, with the rider using a computer screen to tell the robot where to go [photo, right].

The navigation system runs in real time on two laptop computers, one for localization and the other for trajectory planning. Using laser range sensors and SLAM algorithms, the system detects people and objects nearby and can distinguish between static and moving obstacles. It does that by successively scanning its surroundings and comparing the scans, which allows it to detect elements that are moving as well as occluded areas that only become "visible" as the robot moves.
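To give a rough idea of how that scan comparison might work, here's a toy illustration (not the JSK implementation), assuming both scans have already been aligned in a common map frame:

```python
import numpy as np

def moving_points(prev_scan, curr_scan, radius=0.15):
    """Flag points of the current scan as 'moving' if no point of the
    previous, already-aligned scan lies within `radius` meters of them.
    Both scans are (N, 2) arrays of x, y points in the same map frame."""
    moving = []
    for point in curr_scan:
        distances = np.linalg.norm(prev_scan - point, axis=1)
        if distances.min() > radius:
            moving.append(point)
    return np.array(moving)

# Example: a point that jumped between scans gets flagged as moving.
prev_scan = np.array([[1.0, 0.0], [2.0, 0.0]])
curr_scan = np.array([[1.0, 0.0], [2.0, 0.8]])
print(moving_points(prev_scan, curr_scan))  # -> [[2.  0.8]]
```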

The system can rapidly detect pedestrians who suddenly start to move as well as people appearing from blind spots. In those cases, the robot can do two things: recompute the trajectory to avoid a collision [image below, left, from a paper they presented at last year's IEEE RO-MAN conference] or stop for a few seconds, wait for the pedestrian, and then start moving again [below, right].

The navigation system uses a deterministic approach to plan the trajectory. Basically, it draws circles around the vertices of static objects and then tries to find a continuous path, tangent to those circles, from origin to destination. Of course, there might be many possible routes, so the system uses an A* algorithm to determine the path to take. You can see a visual representation of this approach below:
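For a more concrete picture of that last step, here's a minimal sketch of A* search over a graph of candidate waypoints. The graph structure, coordinates, and costs are my own illustration, not the JSK code; in their approach the edges would correspond to the tangent segments between the safety circles.

```python
import heapq
import math

def a_star(graph, coords, start, goal):
    """Plain A* over a waypoint graph. `graph[node]` is a list of
    (neighbor, edge_length) pairs and `coords[node]` gives each
    waypoint's (x, y), used for the straight-line heuristic."""
    def h(n):
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return math.hypot(x2 - x1, y2 - y1)

    frontier = [(h(start), 0.0, start, [start])]  # (f, g, node, path so far)
    best_g = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path                           # cheapest route found
        for neighbor, cost in graph[node]:
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(frontier, (new_g + h(neighbor), new_g,
                                          neighbor, path + [neighbor]))
    return None                                   # no route exists

# Tiny example: three waypoints between origin "A" and destination "C".
coords = {"A": (0, 0), "B": (1, 1), "C": (2, 0)}
graph = {"A": [("B", 1.5), ("C", 2.5)], "B": [("C", 1.5)], "C": []}
print(a_star(graph, coords, "A", "C"))  # -> ['A', 'C']
```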

And although I didn't get to see it, researchers told me they're also developing a PMR model designed specifically for indoor use. It's lighter (45 kg) and more compact, and the rider can control it by shifting his or her body, just like a Segway.

That means, the researchers say, that you can ride it hands-free: Just tell it where to go and enjoy the ride while sipping a drink or reading a book.

Photos: Information and Robot Technology/University of Tokyo, JSK Robotics Lab

How Recycling Robots Could Help Us Clean the Planet


Dustbot, a garbage-collecting robot created by the Scuola Superiore Sant'Anna's CRIM Lab.
Photo: Massimo Brega

At the current rate of global population growth and consumption of resources, it seems clear to me where we're going to end up: on a waste-covered Earth like the one depicted in the movie WALL-E.

Needless to say, recycling is one of the most important things we can do to keep our planet sustainable. I think it won't be long until governments all over the world create all kinds of incentives to improve recycling.

Which brings us to ... robots!

Recycling is a very promising area for robotics. Over the next few decades, I imagine a future where waste-collecting robots will be moving through air, land, and water, reaching hard-to-access areas to help us clean our environment. Picture WALL-E, but before the whole planet becomes a landfill.

In fact, there are already some recycling bot prototypes roaming around. One example is Dustbot, a robot developed at the Scuola Superiore Sant'Anna's CRIM Lab, in Pisa, Italy. Led by Prof. Paolo Dario, the laboratory created a robot designed specifically to collect garbage at people's homes.

It's 1.5 meters tall, weighs 70 kilograms, and can carry a payload of up to 80 liters, or 30 kg. The robot can travel at 1 meter per second, and its battery gives it 16 kilometers of autonomy.


Photo: Massimo Brega

According to this BBC story, the Dustbot can be summoned to your address through a mobile phone at any time of the day. Basically, the machine -- built using a Segway Robot Mobility Platform -- uses a GPS system and motion sensors to drive around the city and show up at your doorstep.

Once it arrives, the user selects the type of garbage to dispose of using a touch screen. A compartment opens on the robot's belly where the user places the garbage, which is then transported to a drop-off location.

The robot's greatest advantage is its size: it can navigate through narrow streets and alleys where normal garbage trucks can't go.

Here's a video showing how Dustbot -- and its "siblings" DustCart and DustClean robots -- work:


Another example is Push, a robot that patrols the streets of Disney World, asking people to feed it with rubbish. Well, it's not exactly a robot -- it's a remote-controlled garbage can. An operator drives it through the crowd, using a speaker system to talk to people, persuading them to recycle their garbage.

Watch it in action in the video below.


It's not WALL-E, but it's funny and efficient, and if it could be made truly autonomous, this simple robot -- along with an army of Dustbots and similar machines -- would be a powerful way of keeping the streets, and hopefully the planet, a bit cleaner.

Do you know of other recycling robots? Let us know.

UPDATED 04/22/10: Dustbot specs added.

Geminoid F: More Video and Photos of the Female Android

geminoid f
Photos: Osaka University (left); Osaka University and Kokoro Company (right); composite (middle).

Geminoid F, the female android recently unveiled by Hiroshi Ishiguro, a roboticist at Osaka University and ATR famous for his ultra-realistic humanlike androids, generated a lot of interest. Several people wrote me asking for more details and also more images. So here's some good news. I got some exclusive photos and video of Geminoid F, courtesy of Osaka University, ATR Intelligent Robotics and Communication Laboratories, and Kokoro Company. Below is a video I put together giving an overview of the project.


Watch in HD here.

And here are some more photos of the android. The first one below is a composite I created using the two photos right beneath it. It shows how the android's silicone body hides all the mechanical and electronic parts.

geminoid f
Composite based on photos below. Notice that the robot's body is not in the exact same position in the two images, so the composite is not a perfect match; also, I had to flip the robot skeleton image to get the right angle, creating a mirrored image that obviously doesn't correspond to reality.

geminoid f
Photos: Osaka University and Kokoro Company; Osaka University

Here's a Kokoro engineer working on the android's face. Ishiguro and Kokoro have long been collaborators, creating several humanlike androids that include the Geminoid HI-1 and Repliee Q1 and Q2.

geminoid f
Photo: Osaka University and Kokoro Company

In developing Geminoid F, Ishiguro paid particular attention to the facial expressions. He wanted an android that could exhibit a natural smile -- and also a frown.

geminoid f
Photos: Osaka University

The android is a copy of a woman in her twenties. Ishiguro told me that her identity will remain "confidential."


Photo: Osaka University

geminoid f
Photo: Osaka University

Here's Geminoid F meeting Geminoid HI-1.

geminoid f
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

geminoid f and geminoid hi-1
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

This one below shows the woman teleoperating the android. A vision system captures her mouth and head movements, reproducing those movements on the android. The woman can also use the mouse to activate certain behaviors.

geminoid f
Photo: Osaka University
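Here's a rough sketch of the kind of retargeting step such a teleoperation system needs: tracked features from the operator's face are turned into actuator targets on the android. The feature names, ranges, and actuator names below are hypothetical placeholders; the real Geminoid F control interface hasn't been published.

```python
def clamp(x, lo=-1.0, hi=1.0):
    return max(lo, min(hi, x))

def retarget(face):
    """Map tracked operator features (head yaw/pitch in radians, lip opening
    normalized 0..1) to normalized actuator targets on the android. All names
    and scaling factors here are illustrative placeholders."""
    return {
        "neck_yaw":   clamp(face["head_yaw"] / 0.8),
        "neck_pitch": clamp(face["head_pitch"] / 0.5),
        "jaw_open":   clamp(face["lip_opening"], 0.0, 1.0),
    }

# Example frame from a face tracker:
print(retarget({"head_yaw": 0.2, "head_pitch": -0.1, "lip_opening": 0.6}))
```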

So tell us: Was Ishiguro able to leap over the abyss of the uncanny valley?

Bandit, Little Dog, and More: University of Southern California Shows Off Its Robots


Bandit, a caregiving humanoid robot developed at USC's Interaction Lab

Last Thursday, I headed out to the University of Southern California campus in Los Angeles for an open house at the Center for Robotics and Embedded Systems (CRES). It was a great opportunity to see some amazing research on humanoid robots, robots learning from humans, machine learning, and biologically inspired robots. Some highlights:

Let's start at the Interaction Lab led by Dr. Maja J. Mataric, a professor of computer science, neuroscience, and pediatrics and director of CRES. Her lab focuses on human-robot interaction, specifically with the goal of developing "socially assistive systems" to help in convalescence, rehabilitation, training, education, and emergency response. (Spectrum recently ran a profile of Mataric, read here.) 

Ross Mead, a graduate student in Mataric's group, is currently working with children with autism through USC's Center for Autism Research in Engineering (CARE). Children with autism tend to interact more easily with robots than with humans. So Dr. Mataric’s group has been exploring the use of socially assistive robots in conjunction with speech processing technology to help improve social communication skills of the children.

Image courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Current results have shown improved speech and interaction skills in autistic children when they're presented with robots such as Bandit, the lab's caregiving robot. It has 6-DOF arms, a head that can pan and tilt, a face with a movable mouth and eyebrows, and stereo cameras for eyes.

In another application, Bandit serves as a social and cognitive aid for the elderly. It will not only instruct the user to perform certain movements, but also motivate the person and ensure that each movement is performed correctly.

Below is a video of Bandit showing off USC colors and interacting with graduate student Juan Fasola (and here's a video with an overview of the project).


Video courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Mead is also studying which aspects of robotic design create a more humanlike appearance and improve the acceptance of robots by humans. This work has involved Sparky (below), a “minimatronic figure” developed by Walt Disney Imagineering Research and Development. The robot has 18 degrees of freedom and uses small servos and tendon-driven mechanisms to reproduce humanlike motions.

One possible application for Sparky will be as a lab tour guide. Equipped with a mobile base, it should be able to stop at various parts of the lab and describe the various projects using speech and gestures.

Watch the video below to see how Sparky uses its tendons and a spring as a spine to try to achieve natural movements:



Next up is the Computational Learning and Motor Control Lab headed by Dr. Stefan Schaal, a professor of computer science and neuroscience.

As part of the DARPA Learning Locomotion program, Schaal and his colleagues are investigating legged locomotion with the quadruped robot Little Dog, developed by Boston Dynamics, whose other robots include the quadruped Big Dog, the LS3 robot mule, and the bipedal PETMAN.

Legged robots have the potential to navigate more diverse and more complex terrain than wheeled robots, but current control algorithms hinder their application. So Schaal’s group is using Little Dog as a platform for learning locomotion, developing learning algorithms that will enable robots to traverse large, irregular, and unexpected obstacles.

I had the opportunity to speak with Dr. Jonas Buchli and Peter Pastor of Dr. Schaal’s group following a demonstration of Little Dog. They discussed potential applications that include survivor location and recovery after a disaster, prosthetic limbs, and space exploration.

Watch the video below to see Little Dog in action (and watch this other video to see the little bot performing even more maneuvers).


Finally, at USC's iLab, Dr. Laurent Itti, a professor of computer science, is investigating how to make robots interact more naturally with humans and more effectively integrate into our lives. For that to happen, it will be important to create robots with humanlike qualities. In other words, robots will have to demonstrate humanlike locomotion, facial expressions, and eye movement. In addition, as robots gradually leave controlled environments, such as factory floors, and enter environments populated by humans, they’ll need enhanced cognitive abilities that enable them to autonomously navigate in an unstructured environment. One way of achieving that is by looking at biology.

One of the lines of research Itti and his students are pursuing involves monitoring the gaze of human participants as they watch a movie or play a video game. Such research will provide a window into how the brain functions, as well as how it may become altered in diseased states. Furthermore, insights into brain function gleaned from the research have applications in machine vision, image processing, robotics, and artificial intelligence. Dr. Itti is also investigating the application of biologically inspired visual models for automatic target detection in cluttered environments, driver alert monitoring, autonomous robotic navigation, and video games.

His group launched the Beobot 2.0 project to create an integrated and embodied artificial intelligence system and, through providing open access to their hardware and software design, enable other research groups to build other robots with diverse capabilities. Below is a picture of Beobot 2.0, and you can watch a video here to see it navigating a corridor.

beobot ilab usc
Image courtesy of Dr. Laurent Itti and USC's iLab

With the expected increase in the robot population over the next decades, robots will become a prevalent force in our lives, permeating environments well beyond manufacturing to include everything from healthcare and emergency response to personal entertainment and services. Along with many benefits, their presence in society will raise new and unforeseen social and ethical questions that will, in effect, give us a better understanding of ourselves and what it means to be human.

In the meantime, what's my Roomba doing?

Daniel Garcia is an intern at Lux Capital and is interested in clean technology and innovations in healthcare. He holds a PhD in biomedical engineering from UCLA.

Dispatches from FIRST Robotics Championship

For those of us who couldn't make it to the FIRST finals in Atlanta this week, National Instruments is posting some cool videos straight from the competition floor. Todd, an NI staff engineer, is doing "man on the street" interviews with the teams, talking about their robots, strategies, and ... costumes. Check out his interview with the Team FTC 7 Stormtrooper guy above. NI will be posting more videos tomorrow and Saturday.

NI is also announcing today that registrations are open for the Moonbots: Google Lunar X Prize LEGO Mindstorms Challenge. From the release:

The MoonBots Challenge is an exciting contest that challenges teams of kids (age 13 and up) and adult mentors to learn about robotics, the Moon, and space exploration by designing and constructing a LEGO MINDSTORMS robot that performs simulated lunar missions similar to those required to win the $30 million Google Lunar X PRIZE. Six-member teams are engaged in hands-on learning experiences, helping to inspire today’s kids to become tomorrow’s innovators and creative problem solvers who explore futures in science and engineering. The MoonBots Challenge is free and open to teams across the globe.

For more information, go to http://www.moonbots.org and watch the video below, an interview with Steven Canvin from LEGO about the competition:

NASA's Robonaut-2 Will Go to Space This Year

NASA's Robonaut-2 (R2), a semi-humanoid robot co-developed with GM, will rocket to space on the shuttle Discovery later this year as part of NASA's final space shuttle mission. It will be the first human-like robot NASA has sent to space.

R2's dexterous hands give it the strength and flexibility to manipulate tools just like humans do, making it an ideal helper for humans in space.

Once it gets to space, R2 will be confined to the inside of the International Space Station (ISS) while astronauts test its ability to operate in zero-g. It may eventually get space-certified like its non-humanoid relative, Dextre, a two-armed dexterous manipulator developed by the Canadian Space Agency. Dextre currently assists in tasks outside the space station.

NASA engineers have less than 6 months to get R2 ready for flight, including vibration, vacuum, and radiation tests. Watch this video to see how they'll do it. 

R2 will launch on STS-133, scheduled for September, and will remain on the ISS.

Photo and video courtesy of NASA.

It's National Robotics Week! Do You Know Where Your Robot Is?

On March 9th, House Resolution 1055 -- introduced by Pennsylvania representative Mike Doyle -- passed in the House, designating the week of April 10-18 as National Robotics Week. The Congressional Robotics Caucus, with some lobbying from iRobot Corporation and other organizations, introduced the resolution to promote activities that help raise awareness of and interest in the nation's growing robotics industry.

What is the Congressional Robotics Caucus, you may ask? In between healthcare, budgeting, recessing, showing up on the Colbert Report, and all the other activities our congresscritters are busy with, the bipartisan committee works hard to understand the various fields of robotics and tries to support them through their work in Congress. Most of the reps on the committee come from states with a strong academic or business presence in robotics, like California, Massachusetts, and Pennsylvania, or places with agricultural or manufacturing industries that field a lot of robots. They receive regular briefings from experts in the field to understand what we're doing here in the US and how competitive we are on a global scale.

So here we are at National Robotics Week. Perhaps not coincidentally, this also happens to be the week of the FIRST robotics championship event in Atlanta, one of the largest gatherings of current and aspiring roboticists in the world. But if you don't happen to be there, plenty of other cities -- big and small -- around the country are celebrating with mini-competitions, classes, laboratory open houses, block parties, and more. The full calendar of events is here. You can alternatively roll your own.

There are plenty of officially recognized Days, Weeks, and Months designed to raise awareness that mean very little to most people, but I'm actually really excited about NRW. Robotics has a unique way of engaging people and getting them excited about technology that looks straight out of sci-fi, and using it as a vehicle to get kids into STEM fields and adults into learning about and supporting our industry is really effective. Check out the events in your area, encourage your friends to attend, and don't forget your shirt!

World Robot Population Reaches 8.6 Million

robot population

The world's robot population has reached 8.6 million. That's more than one automaton for every citizen of Austria!

UPDATE: Boing Boing came up with a much better comparison:

World robot population: 8.6 million

World population of high net worth individuals, as of 2009: 8.6 million

Population of New Jersey: 8.7 million

Number of Americans who participated in Pilates last year: 8.6 million

I arrived at the 8.6 million estimate based on data from the latest edition of World Robotics, a great numbers-filled report prepared annually by the International Federation of Robotics, or IFR. The report came out late last year -- I finally had time to take a look at it -- and refers to the robot market up to the end of 2008.

So how are robots doing compared to previous years?

First, some nomenclature. The study divides robots into two categories: industrial robots and service robots. The first category includes welding systems, assembly manipulators, silicon wafer handlers -- you know, big, heavy, expensive, many-degrees-of-freedom machines. The second category consists of two subcategories: professional service robots (things like bomb-disposal bots, surgical systems, milking robots) and personal service robots (vacuum cleaners, lawn mowers, all sorts of robot hobby kits and toys).

As you can see from the chart above, the number of industrial robots grew to 1.3 million in 2008 from about 1 million in 2007, and service robots grew to 7.3 million from 5.5 million. So for industrial and service robots combined it's a 32 percent increase from 2007 to 2008, and that's huge.
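If you want to check that figure, the arithmetic is straightforward:

```python
# Totals from the World Robotics report, as cited above.
industrial_2007, industrial_2008 = 1.0e6, 1.3e6
service_2007, service_2008 = 5.5e6, 7.3e6

total_2007 = industrial_2007 + service_2007      # 6.5 million
total_2008 = industrial_2008 + service_2008      # 8.6 million
growth = (total_2008 - total_2007) / total_2007
print(f"{growth:.0%}")                           # -> 32%
```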

That said, you have to understand the numbers. The World Robotics report doesn't add up industrial and service robots. I do. The report keeps these two categories separate, I believe, because these are very different robots in terms of complexity and cost: an industrial robot can be a multimillion-dollar manipulator (like the Kuka KR 1000 titan, below), whereas a service robot can be a $50 toy robot.

Another reason to keep them separate: the total numbers for each category mean different things. The total of industrial robots is the "worldwide operational stock," or robots actually operational today. The total of service robots, on the other hand, consists of units sold up to the end of 2008, which includes robots no longer in operation, like that first-generation Roomba you harvested for parts long ago.

So why do I add the numbers? Well, because I think it's kind of cool to have a number for the world's robot population.

kuka robotics kr 1000 titan
Now on to some highlights from the report. First, industrial robots.

  • According to the report, 2008 sales reached 113,000 units, which is about the same as the previous year. It's a weak result, and the culprit, as you might have guessed, is the global economic meltdown.

  • A breakdown by region. Of the 2008 robot sales, more than half, or about 60,300 units, went to Asian countries (including Australia and New Zealand). The world's largest market, Japan, continues to see a decline, with supply falling by 8 percent to about 33,100 units. But Korea and emerging markets like China, India, and the Southeast Asian countries saw increases in sales: Korea added 11,600 robots, up 28 percent from 2007; China added 7,900 units, an increase of 20 percent; and Taiwan's robot acquisitions surged by 40 percent.

  • In the Americas, robot sales totaled 17,200 units, 12 percent less than in 2007. The auto industry, the main robot buyer, retreated, and robot sales plunged.

  • Robot sales in Europe stagnated at about 35,100 units, with Germany taking the lead, adding 15,200 robots, 4 percent more than in 2007. Italy, Europe's second largest market after Germany, added 4,800 units and France, 2,600 robots.

  • So the total of industrial robots in 2008? First, a number that I hadn't seen before. The report says that "total accumulated yearly sales, measured since the introduction of industrial robots in industry at the end of 1960s, amounted to more than 1,970,000 units at the end of 2008." That's basically the total of industrial robots sold in the world. Ever. Cool! So to get the total of industrial robots in operation you need to remove the ones that have been taken out of service. People use different statistical models to do that, arriving at different numbers. The World Robotics report gives an estimate between 1,036,000 and 1,300,000 units.

  • Still according to the report, world industrial robot sales amounted to about US $6.2 billion in 2008. But this amount doesn't include the cost of software, peripherals, and systems engineering. If you were to add that in, the market would be some three times larger, or around $19 billion.

irobot roomba
Now on to service robots.

  • First, some more nomenclature. The World Robotics report differentiates between two kinds of service robots: service robots for professional use and service robots for personal use. That's because the personal ones are sold for much less and are mass produced.

  • According to the report, 63,000 service robots for professional use were sold in 2008, a market valued at $11.2 billion.

  • A breakdown by application: 30 percent (20,000 units) for defense, security, and rescue applications; 23 percent for milking robots; 9 percent for cleaning robots; 8 percent each for medical and underwater robots; 7 percent for construction and demolition robots; 6 percent for robot platforms for general use; and 5 percent for logistic systems.

  • As for service robots for personal use: 4.4 million units sold for home applications (vacuuming and lawn mowing bots) and about 2.8 million for entertainment and leisure (toy robots, hobby systems, and educational bots).

  • And here's an eye opening number: In 2008 alone about 940,000 vacuum cleaning robots (like the iRobot Roomba 562 Pet Series above) were sold, almost 50 percent more than in 2007. That's 1 million new living rooms getting cleaned by robots!

  • Finally, a forecast. The report estimates that 49,000 professional service robots and 11.6 million personal service robots will be sold between 2009 and 2012.

A note about this last bullet point. If the forecast holds and we add those units to a little over 1 million industrial robots (their numbers grow very slowly), we'd get a grand total world robot population of nearly 13 million by around 2011 or 2012. That would mean one robot for every person in Zambia. Or Illinois.

As usual, special thanks go to the IFR statistical department folks for putting this report together.

Riding Honda's U3-X Unicycle of the Future

honda u3-x

honda u3-x
It only has one wheel, but Honda's futuristic personal mobility device, called the U3-X, is no pedal-pusher. The unicycle of the future moves as you move, wheeling you to your destination simply by sensing your body tilting this way or that, Segway style.

Honda says the machine is designed for indoor use, but last week, when the company demoed it for us in New York, it worked just fine in Times Square. Watching the Honda engineers riding it around on a Broadway sidewalk was like getting a glimpse of the future.

I also got a chance to try the U3-X -- not on Broadway, but at a hotel conference room nearby. Following the instructions of the cheerful Shin-ichiro Kobashi, the U3-X lead engineer, I hopped on the seat and took a few seconds to orient myself.

After some tentative leaning to test how far I could go without falling, I got a feel for the device and found it simple to navigate. You just lean slightly in the desired direction and off you go.

It's definitely a trip. The footrests are there just for balance, not to steer. Your hands stay free, and you're perched just a little below eye level, at a height where it still feels natural to talk to someone standing up. Putting your feet down helps you stop and turn, especially if you're rapidly approaching a wall. And the whirring of the machine is so satisfying!

Watch the video to see how easy it is to ride it:

The U3-X uses a balance control system that derives from Honda's research on human walking dynamics for its famed Asimo bipedal humanoid robot.

honda asimo
When the rider leans his or her body, an angle tilt sensor sends data to the balance control system, which in turn moves the wheel, maintaining balance.

But the amazing thing about the U3-X is not quite visible: its omnidirectional wheel.

The wheel consists of a ring of small rubber wheels overlapping a single large wheel (see illustration below). When the large wheel rotates, the U3-X moves forward or backward. When the small wheels rotate, the machine moves left or right. And when both the large and small wheels turn at the same time, the U3-X moves diagonally.

Honda showed us animations but didn't let us take photos of the wheel itself. It's a really ingenious system that uses only two motors to accomplish all of its movement.
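Conceptually, the control problem splits cleanly across those two motors. Here's a toy sketch of that decomposition; the mapping and scaling are my own illustration, not Honda's control code.

```python
def lean_to_velocity(pitch, roll, gain=2.0):
    """Turn the rider's measured lean (pitch = forward/back, roll = left/right,
    both in radians) into a desired ground velocity. The gain is a placeholder."""
    return gain * pitch, gain * roll

def wheel_commands(vx, vy):
    """Split the desired velocity between the two motors: the single large
    wheel produces the forward/backward component, the ring of small rubber
    wheels produces the sideways component. Driving both at once moves the
    machine diagonally."""
    return vx, vy   # (large-wheel speed, small-wheel-ring speed)

# Example: leaning slightly forward and to the right drives both motors,
# so the U3-X moves diagonally.
print(wheel_commands(*lean_to_velocity(0.05, 0.03)))  # -> (0.1, 0.06)
```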

So how fast can it go? It has a top speed of 6 kilometers per hour, which is a little better than the average walking speed of an adult, and the lithium-ion battery will let you ride around for an hour.

The machine weighs less than 10 kilograms (22 pounds) and max rider weight is currently 100 kg (220 lb).

honda u3-x
Because it's such a narrow device, no wider than the distance between your legs, it won't get in the way of other pedestrians or riders on crowded streets or in an office environment, Kobashi explained.

The seat folds down and the footrests fold up, so it fits in a compact package that looks a bit like a slim boombox. It's not George Jetson's foldable space car, but you can grab it by the handle and roll it around like a suitcase.

But the questions many passersby at Times Square asked were, "Where do I get one?" and "How much does it cost?" Alas, Honda doesn't know that yet. Guess we'll have to wait for the future to come.

Some more photos of the device in Times Square and slides from a technical presentation the Honda engineers gave us:

honda u3-x

honda u3-x

honda u3-x

honda u3-x specs

READ ALSO:

Obama Meets Japanese Robots
Mon, November 15, 2010

Blog Post: The president was greeted by humanoid HRP-4C and caressed Paro the robotic seal

 Wii-Controlled Robotic Wheelchair
Thu, April 22, 2010

Blog Post: Japanese researchers have built a robotic wheelchair operated with a Wii game controller

Humanoid Robots Rise
Mon, October 04, 2010

Blog Post: Watch out, Asimo, these new humanoids are on your tail

Iran Unveils Humanoid Robot
Mon, August 16, 2010

Blog Post: IEEE Spectrum obtained exclusive videos showing the humanoid walking, slowly

Meet Geminoid F, a Smiling Female Android

UPDATE2: Just added more photos and a video of the android.

UPDATE: Ishiguro just told me that they won't be able to provide "any private information on the model" who served as the template for Geminoid F and that her identity will be kept "confidential."


Photo: Osaka University

Japanese roboticist Hiroshi Ishiguro unveiled today his latest creation: a female android called Geminoid F. The new robot, a copy of a woman in her twenties with long dark hair, can smile, frown, and change facial expressions more naturally than Ishiguro's previous androids.

Ishiguro, a professor at Osaka University, is famous for creating a robot replica of himself, the Geminoid HI-1, a telepresence android that he controls remotely. The new Geminoid F ("F" stands for female) is also designed to be remote controlled by a human operator.

In a press conference in Osaka, Ishiguro demonstrated how the android could mimic the facial expressions of the woman as she sat in front of a computer with cameras and face-tracking software.

Here's a video I put together:

Ishiguro built the android as part of his work at Osaka University and ATR Intelligent Robotics and Communication Laboratories, with collaboration from Kokoro Co., a Japanese firm that specializes in animatronics and ultrarealistic androids.

In designing Geminoid F, Ishiguro's team and Kokoro engineers wanted to create an android that could exhibit a wide range of natural expressions without requiring as many actuators as other androids they'd developed. In particular, they wanted the robot to sport a convincing smile -- not just any smile but, as Kokoro put it, a "toothy smile." It can also make a frown.

geminoid f
Photos: Osaka University

Whereas the Geminoid HI-1 has some 50 actuators, the new Geminoid F has just 12. What's more, the HI-1 robot requires a large external box filled with compressors and valves. With Geminoid F, the researchers embedded air servo valves and an air servo control system into its body, so the android requires only a small external compressor.

The new design helped reduce the android's cost, said Kokoro, which will sell copies of Geminoid F for about 10 million yen (US $110,000). Ishiguro and his collaborators plan to test the android in hospitals and also show it off at science museums and other venues. 

Ishiguro's previous androids, in addition to his own copy, include replicas of his then four-year-old daughter and of a Japanese TV newscaster. I couldn't find more details about the identity of the Geminoid F's master template, only that she is "one-quarter non-Japanese."

But I agree with Ishiguro when he says that one of the new android's advantages over his own replica (photo on the right) is that Geminoid F has a friendlier appearance and people will be more eager to interact with it. Would anyone disagree?


Photo: Makoto Ishida for IEEE Spectrum

READ ALSO:

Geminoid F Gets Job as Robot Actress
Thu, November 11, 2010

Blog Post: Is her next stop -- Broadway?

Geminoid F: More Photos of the Android
Tue, April 20, 2010

Blog Post: IEEE Spectrum obtained exclusive images and video of Hiroshi Ishiguro's new android

Who's Afraid of the Uncanny Valley?
Fri, April 02, 2010

Blog Post: To design the androids of the future, we shouldn't fear exploring the depths of the uncanny valley

The Man Who Made a Copy of Himself
April 2010

Article: A Japanese roboticist is building androids to understand humans--starting with himself

 
