Automaton

Virginia Tech's Humanoid Robot CHARLI Walks Tall

romela charli

Dennis Hong, a professor of mechanical engineering and director of Virginia Tech's Robotics & Mechanisms Laboratory, or RoMeLa, has created robots with the most unusual shapes and sizes -- from strange multi-legged robots to amoeba-like robots with no legs at all.

Now he's unveiling a new robot with a more conventional shape: a full-sized humanoid robot called CHARLI, or Cognitive Humanoid Autonomous Robot with Learning Intelligence.

The robot is 5 feet tall (1.52 meters), untethered and autonomous, and capable of walking and gesturing.

But its biggest innovation is that it doesn't rely on motors driving rotational joints directly.

Most humanoid robots -- Asimo, Hubo, Mahru -- use DC motors to rotate various joints (typically at the waist, hips, knees, and ankles). The approach makes sense and, in fact, today's humanoids can walk, run, and climb stairs. However, this approach doesn't correspond to how our own bodies work, with our muscles contracting and relaxing to rotate our various joints.

Dr. Hong and his students wanted a system closer to human anatomy -- one they could build in a short time and on a small budget. So to generate movement, they engineered an ingenious linkage system of pulleys and springs. This actuation system is much lighter than those of other humanoids, and the team was able to design and build it in a year and a half with only about US $20,000, plus donated software and hardware such as LabVIEW and Single-Board RIO.

Dr. Hong is already working on a new version of the robot, CHARLI-H, that will use linear actuators to power its legs. You can see the actuator, the black cylinder, in the photo below:

Linear actuators have been tried in humanoids before but, as far as I know, without much success. So I look forward to seeing how the approach will work out in this case. Will CHARLI be able to walk more naturally than Asimo? Only time will tell.

Dr. Hong, for his part, remains confident he'll be able to improve the overall capabilities of humanoid robots, in particular bipedal locomotion.

Or as he put it as CHARLI took its first steps, "One small step for a robot, one giant leap for robotics."

Watch the robot (the current version is called CHARLI-L) in action:

More photos:

Photos and video: RoMeLa and Virginia Tech

UPDATE: Corrected details about the use of linear actuators, which will be present in an upcoming version of the robot.

Humanoid Robot Mahru Mimics a Person's Movements in Real Time

mahru humanoid robot

Mahru, an advanced biped humanoid developed by the Korea Institute of Science and Technology (KIST) and Samsung Electronics, has a bunch of skills.

Mahru knows its way around a kitchen, popping a snack into the microwave and bringing it to you, as KIST researchers demonstrated when they unveiled the robot's latest version, Mahru Z, early this year.

Mahru can also dance and perform Taekwondo moves. (More on that later.)

Now how do the KIST researchers go about programming Mahru to do all that? I asked this question when I visited KIST a while ago.

Dr. Bum-Jae You, head of KIST's Cognitive Robotics Center, in Seoul, told me that they use two approaches. One involves filming a person with body markers using a traditional optical motion-capture system to track the body movements. The other, which they've been using more recently, relies on a wearable inertial motion-capture suit [photo above].

A person wears the suit while performing various tasks. The movements are recorded, and the robot is then programmed to reproduce the tasks while adapting to changes in the space, such as a displaced object.

But the cool thing is, the capture and reproduction of movements can also take place in real time. When I visited, Dr. You and his students demonstrated this capability using a Mahru III robot.

Watch the demo:

Watch in HD

When the operator moves his arms, Mahru moves its arms. There's virtually no delay. There's a delay, though, in the walking part -- after the operator takes a few steps, it takes some time for the robot to follow suit. But Dr. You told me they're working to do that in real time as well.
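To make the idea concrete, here's a rough sketch, in Python, of what a real-time retargeting loop can look like: capture the operator's joint angles, clamp them to the robot's joint limits, and send the result at a fixed rate. The joint names, limits, and the synthesized "operator" motion are all made up for illustration; this is not KIST's actual software.

```python
import math
import time

# Hypothetical joint limits for a robot arm (radians); not Mahru's actual values.
JOINT_LIMITS = {
    "shoulder_pitch": (-1.5, 1.5),
    "shoulder_roll": (-0.5, 1.7),
    "elbow": (0.0, 2.3),
}

def read_operator_pose(t):
    """Stand-in for the motion-capture suit: returns joint angles (radians).
    Here we just synthesize a waving motion for demonstration."""
    return {
        "shoulder_pitch": 0.8 * math.sin(t),
        "shoulder_roll": 1.0,
        "elbow": 1.2 + 0.4 * math.sin(2 * t),
    }

def retarget(pose):
    """Map human joint angles to robot commands by clamping to joint limits."""
    cmd = {}
    for joint, angle in pose.items():
        lo, hi = JOINT_LIMITS[joint]
        cmd[joint] = max(lo, min(hi, angle))
    return cmd

def control_loop(duration=2.0, rate_hz=50):
    """Run a fixed-rate loop: capture, retarget, 'send' to the robot."""
    dt = 1.0 / rate_hz
    t = 0.0
    while t < duration:
        cmd = retarget(read_operator_pose(t))
        print(f"t={t:4.2f}s  command: " +
              ", ".join(f"{j}={a:+.2f}" for j, a in cmd.items()))
        t += dt
        time.sleep(dt)  # in a real system, this would be the servo update period

if __name__ == "__main__":
    control_loop(duration=0.1)
```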

There are several telepresence robots out there, and many more should be around soon. But typically an operator has control only over the robot's legs or wheels or head; very few allow for full-body remote operation.

Mahru's arm movements under teleoperation are quite impressive -- fast and precise, and also safe, thanks to force-torque sensors and compliant control. Eventually, Dr. You says, a person will be able to teleoperate a robot to accomplish manipulation tasks -- and also walk over to people and shake their hands.

Note on the photo above the operator with the motion-capture suit (behind the robot) extending his right hand -- while the robot does the same.

Dr. You and his team also showed me Mahru's dancing capabilities. This demo involved an earlier version of the Mahru robot [below]. Really cool to see the "guts" of the machine -- and the sticker saying "Dancer."

Watch the dance:

Watch in HD

I asked Dr. You if that was traditional Korean dance. Nope, he said, laughing. The choreography comes from the Wonder Girls.

READ ALSO:

Hubo II Humanoid Robot Is Lighter and Faster, Makes His Creator Proud
Tue, March 30, 2010

Blog Post: The creator of Albert Hubo is back with a new, better -- and less creepy -- humanoid robot

Kojiro Humanoid Robot Mimics Your Musculoskeletal System
Thu, March 04, 2010

Blog Post: University of Tokyo researchers are developing a humanoid that mimics the way our skeleton, muscles, and tendons work to generate motion

Geminoid F: More Video and Photos of the Female Android
Tue, April 20, 2010

Blog Post: IEEE Spectrum obtained exclusive images and video of Hiroshi Ishiguro's new android

Hiroshi Ishiguro: The Man Who Made a Copy of Himself
April 2010

Article: A Japanese roboticist is building androids to understand humans--starting with himself

Mars Escape Online Game Helps To Study Human-Robot Interaction

Researchers at the Personal Robots Group at MIT's Media Lab have developed a new online game to study human-robot interaction. Mars Escape forms teams of two human players, with one person taking on the role of an astronaut and the other controlling a robot on Mars. Human and robot must work together to complete various missions before oxygen runs out.

In the current first phase of the study, the researchers are investigating how human players work as a team. In a second phase later this year, they will use the data collected from human players to build a model of how humans interact for task assignment, action synchronization, and problem solving. The model will then be tested by using it to drive an in-game AI.

The long-term goal of the project is to create robots that are able to assist and work with humans in a natural, predictable and robust way. To this end, the researchers will recreate the game environment in real life this summer at the Boston Museum of Science, and test their behavior model using their MDS robot Nexi.

New Boston Dynamics Videos Reveal Faster BigDog, PETMAN

Boston Dynamics has posted some updated videos of BigDog and PETMAN. As far as I can tell, there isn’t much new going on… BigDog still carries a bunch of stuff, climbs up muddy hills, doesn’t fall down on ice, looks like two guys running around under a tarp, and sounds like a swarm of killer bees. The one new sequence that I noticed shows BigDog running (the definition of running being an airborne gait phase) at 5 mph. At the end of the video, when the hydraulics are run externally and the engine is off, BigDog sounds a lot more reasonable. Unfortunately, it’s hard to beat the power density and instant rechargeability of petroleum-based fuels, so we might be stuck with the bees for a while longer.

PETMAN is moving a bit more briskly as well, reaching a walking speed of 4.4 mph. Although it's dynamically balancing itself, it still looks to me like it's perpetually on the verge of falling over, but arguably that's what dynamic balancing is all about. Remember that eventually PETMAN is supposed to be able to crawl, sweat, and do 'calisthenics' to test protective clothing. And when I say eventually, I mean by 2011, but that seems a little bit optimistic at this point. Artificial fingers crossed!

[ Boston Dynamics ]

The Robots Podcast: 50 Years of Robotics (Part 1)

I'm not a fan of self-promotion, but I believe this may be of general interest: The Robots podcast (which I founded and which is now run by my colleague Sabine Hauert) is celebrating its 50th episode today. For the occasion, Robots has interviewed 12 experts from a variety of robotics backgrounds on the topic of "The Past and the Next 50 Years of Robotics." Here is the lineup of interviewees for the first part of the two-part series:

Part 2 of this series will air in 2 weeks and give a snapshot view of the past and next 50 years in Nano Robotics, Artificial Intelligence, Flying Robots, Human-Robot Interaction, Robot Business, and Space Robots. Tune in!

PS: Coinciding with its 50th episode, the Robots podcast has also just launched its new website and forum.

Personal Mobility Robot Operated by Wii Controller

The Personal Mobility Robot, or PMR, is a nimble robotic wheelchair that self-balances on two wheels like a Segway. The machine, based on a platform developed by Toyota, has a manual controller that the rider uses to change speed and direction [see photo, inset].

Now two University of Tokyo researchers have decided to upgrade the machine, making it controllable by a Wii remote controller. Why drive the thing with a Wii-mote? Well, why not?

Last year, when I visited the JSK Robotics Laboratory, part of the university's department of mechano-informatics and directed by Professor Masayuki Inaba, researchers Naotaka Hatao and Ryo Hanai showed me the PMR under Wii control. They didn't let me ride it while they piloted the machine, but it was fun to watch.

Watch:

Watch in HD here.

The PMR project is part of the Information and Robot Technology (IRT) research initiative at the University of Tokyo. Researchers developed the machine to help elderly people and people with disabilities remain independent and mobile. The machine is designed to be reliable and easy to operate, capable of negotiating indoor and outdoor environments -- even slopes and uneven surfaces. It weighs 150 kilograms and can move at up to 6 kilometers per hour.

The PMR is a type of robot known as a "two-wheeled inverted pendulum mobile robot" -- like the Segway and many others. The advantages of a self-balancing two-wheeled machine are its smaller footprint (compared to, say, a four-wheeled one) and its ability to turn on its own axis, which is convenient in tight spaces.
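To illustrate why such a platform needs active balancing, here's a toy simulation (not the PMR's actual controller): a linearized inverted pendulum that falls over on its own but settles back to vertical under a simple PD controller acting on the tilt angle. All parameters are illustrative.

```python
import math

# Illustrative parameters (not the PMR's actual values)
g = 9.81      # gravity, m/s^2
l = 0.5       # effective pendulum length, m
dt = 0.01     # time step, s
Kp, Kd = 40.0, 8.0   # PD gains on tilt angle and tilt rate

theta, theta_dot = 0.1, 0.0   # start tilted 0.1 rad off vertical

for step in range(300):
    # Control: base acceleration chosen to push the wheels back under the center of mass
    u = Kp * theta + Kd * theta_dot

    # Linearized inverted-pendulum dynamics: unstable without control.
    # theta_ddot = (g/l)*theta - (1/l)*u  (the base acceleration u opposes the fall)
    theta_ddot = (g / l) * theta - (1.0 / l) * u
    theta_dot += theta_ddot * dt
    theta += theta_dot * dt

    if step % 50 == 0:
        print(f"t={step*dt:4.2f}s  tilt={math.degrees(theta):6.2f} deg")
```

With the controller's gains set to zero, the tilt grows without bound; with them on, it decays back to zero, which is the essence of dynamic balancing on two wheels.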

The machine assumes a lower configuration to allow a rider to climb on the seat. Then it raises itself, allowing the two wheels to dynamically balance the vehicle. (The two little wheels you see at the front and at the back are for safety, in case the machine tips over.)

In addition to Wii-mote controllability, the JSK researchers have been working on an advanced navigation system that is able to localize itself and plan trajectories, with the rider using a computer screen to tell the robot where to go [photo, right].

The navigation system runs in real time on two laptop computers, one for localization and the other for trajectory planning. Using laser range sensors and SLAM algorithms, the robot detects people and objects nearby and can distinguish between static and moving obstacles. It does that by successively scanning its surroundings and comparing the scans, which allows it to detect elements that are moving, as well as occluded areas that only become "visible" as the robot moves.
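The scan-comparison idea can be sketched in a few lines. The toy example below (my own illustration, not the JSK code) rasterizes two successive laser scans into a coarse occupancy grid and flags cells that are occupied now but were free before as candidate moving obstacles.

```python
import numpy as np

CELL = 0.2      # grid resolution in meters
SIZE = 50       # grid is SIZE x SIZE cells, robot roughly at the center

def to_grid(points):
    """Rasterize 2D scan points (meters, robot frame) into a boolean occupancy grid."""
    grid = np.zeros((SIZE, SIZE), dtype=bool)
    for x, y in points:
        i, j = int(x / CELL) + SIZE // 2, int(y / CELL) + SIZE // 2
        if 0 <= i < SIZE and 0 <= j < SIZE:
            grid[i, j] = True
    return grid

# Two successive scans: a wall (static) plus one point that moved between scans.
wall = [(2.0, y * 0.2 - 2.0) for y in range(20)]
prev_scan = wall + [(1.0, 0.0)]    # pedestrian at (1.0, 0.0)
curr_scan = wall + [(1.0, 0.6)]    # pedestrian has moved sideways

prev_grid, curr_grid = to_grid(prev_scan), to_grid(curr_scan)

# Cells occupied now but not before are candidate moving (or newly revealed) obstacles.
candidates = np.argwhere(curr_grid & ~prev_grid)
for i, j in candidates:
    x, y = (i - SIZE // 2) * CELL, (j - SIZE // 2) * CELL
    print(f"possible moving obstacle near x={x:.1f} m, y={y:.1f} m")
```

A real system would also align the scans using the robot's own motion estimate before differencing them, which is where the SLAM machinery comes in.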

The system can rapidly detect pedestrians who suddenly start to move as well as people appearing from blind spots. In those cases, the robot can do two things: recompute the trajectory to avoid a collision [image below, left, from a paper they presented at last year's IEEE RO-MAN conference] or stop for a few seconds, wait for the pedestrian, and then start moving again [below, right].

The navigation system uses a deterministic approach to plan the trajectory. Basically, it assigns circles to all vertices of static objects and then tries to draw a continuous line that is tangent to the circles, going from origin to destination. Of course, there might be a lot of possible routes, so the system uses an A* algorithm to determine the path to be taken. You can see a visual representation of this approach below:
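To give a sense of the search step itself, here is a minimal, generic A* implementation over a small weighted graph with a straight-line (Euclidean) heuristic, the kind of search that would run over the tangent graph described above. The waypoints and edges are invented for illustration and are not taken from the actual system.

```python
import heapq
import math

# Hypothetical waypoints (e.g., tangent points around obstacle circles), as (x, y) in meters.
nodes = {
    "start": (0.0, 0.0),
    "a": (1.0, 1.5),
    "b": (2.0, -0.5),
    "c": (3.5, 1.0),
    "goal": (5.0, 0.0),
}

# Edges exist only where a straight, collision-free segment was found.
edges = {
    "start": ["a", "b"],
    "a": ["c"],
    "b": ["c", "goal"],
    "c": ["goal"],
    "goal": [],
}

def dist(u, v):
    (x1, y1), (x2, y2) = nodes[u], nodes[v]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    """A*: always expand the node with the lowest (cost so far + straight-line estimate)."""
    open_set = [(dist(start, goal), 0.0, start, [start])]
    best = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        if node in best and best[node] <= g:
            continue
        best[node] = g
        for nxt in edges[node]:
            g2 = g + dist(node, nxt)
            heapq.heappush(open_set, (g2 + dist(nxt, goal), g2, nxt, path + [nxt]))
    return None, float("inf")

path, length = a_star("start", "goal")
print("path:", " -> ".join(path), f"(length {length:.2f} m)")
```

Because the straight-line heuristic never overestimates the remaining distance, A* is guaranteed to return the shortest route through the graph while exploring far fewer candidates than a blind search.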

And although I didn't get to see it, the researchers told me they're also developing a PMR model designed specifically for indoor use. It's lighter (45 kg) and more compact, and the rider can control it by shifting his or her body, just like on a Segway.

That means, the researchers say, that you can ride it hands-free: Just tell it where to go and enjoy the ride while sipping a drink or reading a book.

Photos: Information and Robot Technology/University of Tokyo, JSK Robotics Lab

How Recycling Robots Could Help Us Clean the Planet


Dustbot, a garbage-collecting robot created by the Scuola Superiore Sant'Anna's CRIM Lab.
Photo: Massimo Brega

At the current rate of global population growth and resource consumption, it seems clear to me where we're going to end up: on a waste-covered Earth like the one depicted in the movie WALL-E.

Needless to say, recycling is one of the most important things we can do to keep our planet sustainable. I think it won't be long until governments all over the world create all kinds of incentives to improve recycling.

Which brings us to ... robots!

Recycling is a very promising area for robotics. Over the next few decades, I imagine a future where waste-collecting robots move through air, land, and water, reaching difficult areas to help us clean our environment. Picture WALL-E, but before the whole planet becomes a landfill.

In fact, there are already some recycling bot prototypes roaming around. One example is Dustbot, a robot developed at the Scuola Superiore Sant'Anna's CRIM Lab, in Pisa, Italy. Led by Prof. Paolo Dario, the laboratory created a robot designed specifically to collect garbage at people's homes.

It's 1.5 meters tall, weighs 70 kilograms, and can carry 80 liters, or 30 kg, of payload. The robot can travel at 1 meter per second, and its battery gives it 16 kilometers of autonomy.


Photo: Massimo Brega

According to this BBC story, the Dustbot can be summoned to your address through a mobile phone at any time of the day. Basically, the machine -- built on a Segway Robotic Mobility Platform -- uses a GPS system and motion sensors to drive around the city and show up at your doorstep.

Once it arrives, the user selects the type of garbage they want to dispose of using a touch screen. A compartment opens on the robot's belly where the user places the garbage, which is then transported to a drop-off location.

The robot's greatest advantage is its size: it can navigate through narrow streets and alleys where normal garbage trucks can't go.

Here's a video showing how Dustbot -- and its "siblings" DustCart and DustClean robots -- work:


Another example is Push, a robot that patrols the streets of Disney World, asking people to feed it with rubbish. Well, it's not exactly a robot -- it's a remote-controlled garbage can. An operator drives it through the crowd, using a speaker system to talk to people, persuading them to recycle their garbage.

Watch it in action in the video below.


It's not WALL-E, but it's funny and efficient, and if it could be made truly autonomous, this simple robot -- along with an army of Dustbots and similar machines -- would be a powerful way of keeping the streets, and hopefully the planet, a bit cleaner.

Do you know of other recycling robots? Let us know.

UPDATED 04/22/10: Dustbot specs added.

Geminoid F: More Video and Photos of the Female Android

geminoid f
Photos: Osaka University (left); Osaka University and Kokoro Company (right); composite (middle).

Geminoid F, the female android recently unveiled by Hiroshi Ishiguro, a roboticist at Osaka University and ATR famous for his ultra-realistic humanlike androids, generated a lot of interest. Several people wrote me asking for more details and also more images. So here's some good news. I got some exclusive photos and video of Geminoid F, courtesy of Osaka University, ATR Intelligent Robotics and Communication Laboratories, and Kokoro Company. Below is a video I put together giving an overview of the project.


Watch in HD here.

And here are some more photos of the android. The first one below is a composite I created using the two photos right beneath it. It shows how the android's silicone body hides all the mechanical and electronic parts.

geminoid f
Composite based on photos below. Notice that the robot's body is not in the exact same position in the two images, so the composite is not a perfect match; also, I had to flip the robot skeleton image to get the right angle, creating a mirrored image that obviously doesn't correspond to reality.

geminoid f
Photos: Osaka University and Kokoro Company; Osaka University

Here's a Kokoro engineer working on the android's face. Ishiguro and Kokoro have long been collaborators, creating several humanlike androids that include the Geminoid HI-1 and Repliee Q1 and Q2.

geminoid f
Photo: Osaka University and Kokoro Company

In developing Geminoid F, Ishiguro paid particular attention to the facial expressions. He wanted an android that could exhibit a natural smile -- and also a frown.

geminoid f
Photos: Osaka University

The android is a copy of a woman in her twenties. Ishiguro told me that her identity will remain "confidential."


Photo: Osaka University

geminoid f
Photo: Osaka University

Here's Geminoid F meeting Geminoid HI-1.

geminoid f
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

geminoid f and geminoid hr-1
Photo: Osaka University and ATR Intelligent Robotics and Communication Laboratories

This one below shows the woman teleoperating the android. A vision system captures her mouth and head movements and reproduces them on the android. The woman can also use a mouse to activate certain behaviors.

geminoid f
Photo: Osaka University

So tell us: Was Ishiguro able to leap over the abyss of the uncanny valley?

Bandit, Little Dog, and More: University of Southern California Shows Off Its Robots


Bandit, a caregiving humanoid robot developed at USC's Interaction Lab

Last Thursday, I headed out to the University of Southern California campus in Los Angeles for an open house at the Center for Robotics and Embedded Systems (CRES). It was a great opportunity to see some amazing research on humanoid robots, robots learning from humans, machine learning, and biologically inspired robots. Some highlights:

Let's start at the Interaction Lab, led by Dr. Maja J. Mataric, a professor of computer science, neuroscience, and pediatrics and director of CRES. Her lab focuses on human-robot interaction, specifically with the goal of developing "socially assistive systems" to help in convalescence, rehabilitation, training, education, and emergency response. (Spectrum recently ran a profile of Mataric; read it here.)

Ross Mead, a graduate student in Mataric's group, is currently working with children with autism through USC's Center for Autism Research in Engineering (CARE). Children with autism tend to interact more easily with robots than with humans. So Dr. Mataric’s group has been exploring the use of socially assistive robots in conjunction with speech processing technology to help improve social communication skills of the children.

Image courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Current results have shown improved speech and interaction skills in autistic children when they are presented with robots, such as the lab's caregiving robot, Bandit. It has 6-DOF arms, a head that can pan and tilt, a face with a movable mouth and eyebrows, and stereo cameras for eyes.

In another application, Bandit serves as a social and cognitive aid for the elderly. It will not only instruct the user to perform certain movements, but also motivate the person and ensure that each movement is performed correctly.

Below is a video of Bandit showing off USC colors and interacting with graduate student Juan Fasola (and here's a video with an overview of the project).


Video courtesy of Dr. Maja J. Mataric and USC Interaction Lab

Mead is also studying what aspects of robotic design create a more humanlike appearance and improve acceptance of robots by humans. This work has involved Sparky (below), a "minimatronic figure" developed by Walt Disney Imagineering Research and Development. The robot has 18 degrees of freedom and uses small servos and tendon-driven mechanisms to reproduce humanlike motions.

One possible application for Sparky is as a lab tour guide. Equipped with a mobile base, it should be able to stop at various parts of the lab and describe the various projects using speech and gestures.

Watch the video below to see how Sparky uses its tendons and a spring as a spine to try to achieve natural movements:



Next up is the Computational Learning and Motor Control Lab headed by Dr. Stefan Schaal, a professor of computer science and neuroscience.

As part of the DARPA Learning Locomotion program, Schaal and his colleagues are investigating legged locomotion with the quadruped robot Little Dog, developed by Boston Dynamics, whose other robots include the quadruped BigDog, the LS3 robot mule, and the biped PETMAN.

Legged robots have the potential to navigate more diverse and more complex terrain than wheeled robots, but current control algorithms hinder their application. So Schaal's group is using Little Dog as a platform for learning locomotion, in which learning algorithms developed with Little Dog will enable robots to traverse large, irregular, and unexpected obstacles.

I had the opportunity to speak with Dr. Jonas Buchli and Peter Pastor of Dr. Schaal’s group following a demonstration of Little Dog. They discussed potential applications that include survivor location and recovery after a disaster, prosthetic limbs, and space exploration.

Watch the video below to see Little Dog in action (and watch this other video to see the little bot performing even more maneuvers).


Finally, at USC's iLab, Dr. Laurent Itti, a professor of computer science, is investigating how to make robots interact more naturally with humans and more effectively integrate into our lives. For that to happen, it will be important to create robots with humanlike qualities. In other words, robots will have to demonstrate humanlike locomotion, facial expressions, and eye movement. In addition, as robots gradually leave controlled environments, such as factory floors, and enter environments populated by humans, they’ll need enhanced cognitive abilities that enable them to autonomously navigate in an unstructured environment. One way of achieving that is by looking at biology.

One of the lines of research Itti and his students are pursuing involves monitoring the gaze of human participants as they watch a movie or play a video game. Such research provides a window into how the brain functions, as well as how it may become altered in diseased states. Furthermore, insights into brain function gleaned from the research have applications in machine vision, image processing, robotics, and artificial intelligence. Dr. Itti is also investigating the application of biologically inspired visual models to automatic target detection in cluttered environments, driver alertness monitoring, autonomous robotic navigation, and video games.
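Itti's lab is known for the Itti-Koch saliency model, and the center-surround idea at its core is easy to sketch. The toy example below (an illustration of the general principle, not the lab's code) blurs a synthetic intensity image at a fine and a coarse scale and takes the absolute difference, so small regions that stand out from their surroundings light up in the resulting map.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic grayscale "scene": mostly uniform, with one small bright target.
rng = np.random.default_rng(0)
image = 0.5 + 0.02 * rng.standard_normal((128, 128))
image[60:64, 90:94] = 1.0   # small bright patch

# Center-surround: a fine-scale blur minus a coarse-scale blur.
center = gaussian_filter(image, sigma=1)
surround = gaussian_filter(image, sigma=8)
saliency = np.abs(center - surround)

# The most salient location should land on (or near) the bright patch.
y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
print(f"most salient point: x={x}, y={y}")
```

The full model repeats this comparison across several scales and feature channels (intensity, color, orientation) and combines the maps, but the single-channel version above already captures why isolated targets pop out of cluttered backgrounds.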

His group launched the Beobot 2.0 project to create an integrated and embodied artificial intelligence system and, by providing open access to their hardware and software designs, to enable other research groups to build robots with diverse capabilities. Below is a picture of Beobot 2.0, and you can watch a video here to see it navigating a corridor.

beobot ilab usc
Image courtesy of Dr. Laurent Itti and USC's iLab

With the expected increase in the robot population over the next few decades, robots will become a prevalent force in our lives, permeating environments beyond manufacturing, from healthcare and emergency response to personal entertainment and services. While providing many benefits, robots will also become part of society, raising new and unforeseen social and ethical questions that will, in effect, give us a better understanding of ourselves and what it means to be human.

In the meantime, what's my Roomba doing?

Daniel Garcia is an intern at Lux Capital and is interested in clean technology and innovations in healthcare. He holds a PhD in biomedical engineering from UCLA.
