
Building a Super Robust Robot Hand


German researchers have built an anthropomorphic robot hand that can endure collisions with hard objects and even strikes from a hammer without breaking into pieces.

In designing the new hand system, researchers at the Institute of Robotics and Mechatronics, part of the German Aerospace Center (DLR), focused on robustness. They may have just built the toughest robot hand yet.

The DLR hand has the shape and size of a human hand, with five articulated fingers powered by a web of 38 tendons, each connected to an individual motor on the forearm.

The main capability that makes the DLR hand different from other robot hands is that it can control its stiffness. The motors can tension the tendons, allowing the hand to absorb violent shocks. In one test, the researchers hit the hand with a baseball bat—a 66 G impact. The hand survived.

The video below shows the fingers moving and the hand getting hit by a hammer and a metal bar:

The DLR team didn’t want to build an anatomically correct copy of a human hand, as other teams have. They wanted a hand that can perform like a human hand both in terms of dexterity and resilience.

The hand has a total of 19 degrees of freedom, only one less than the real thing, and it can move its fingers independently to grasp varied objects. The fingers can exert a force of up to 30 newtons at the fingertips, which also makes this one of the strongest robot hands ever built.

Another key element in the DLR design is a spring mechanism connected to each tendon. These springs [photo left] give the tendons, which are made from a super strong synthetic fiber called Dyneema, more elasticity, allowing the fingers to absorb and release energy, like our own hands do. This capability is key for achieving robustness and for mimicking the kinematic, dynamic, and force properties of the human hand.

During normal operation, the finger joints can turn at about 500 degrees per second. By tensioning the springs, and then releasing their energy to produce extra torque, the joint speed can reach 2000 degrees per second. This means that this robot hand can do something few others, if any, can: snap its fingers.

Why build such a super strong hand?

Markus Grebenstein, the hand's lead designer, says that existing robot hands built with rigid parts, despite their Terminator-tough looks, are relatively fragile. Even small collisions, with forces of a few tens of newtons, can dislodge joints and tear fingers apart.

“If every time a robot bumps its hand, the hand gets damaged, we’ll have a big problem deploying service robots in the real world,” Grebenstein says.

To change its stiffness, the DLR hand uses an approach known as antagonistic actuation. The joints of each finger [photo below] are driven by two tendons, each attached to one motor. When the motors turn in the same direction, the joint moves; when they turn in opposite directions, the joint stiffens.
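To make the idea concrete, here is a toy model of antagonistic actuation with quadratic springs. The spring constant and pulley radius are invented values, and this is a sketch of the principle, not DLR's actual control law:

```python
# Illustrative model of an antagonistic joint (not DLR's actual controller).
# Two tendons pull on the joint via pulleys; nonlinear (quadratic) springs
# make joint stiffness grow with co-contraction. k and r are invented values.

def joint_torque(c, d, k=2000.0, r=0.01):
    """Net joint torque when both tendons share a co-contraction stretch c (m)
    and the drive command displaces them by +d and -d respectively."""
    t1 = k * (c + d) ** 2   # flexor tendon tension: T = k * stretch^2
    t2 = k * (c - d) ** 2   # extensor tendon tension
    return r * (t1 - t2)    # simplifies to 4*r*k*c*d

def joint_stiffness(c, k=2000.0, r=0.01):
    """Stiffness d(torque)/d(displacement) grows linearly with co-contraction
    c: tensioning both tendons stiffens the joint without moving it."""
    return 4 * r * k * c
```

With tension proportional to the square of the stretch, stiffness rises with co-contraction, so a joint can be stiffened in place (motors turning in opposite directions) or moved while staying compliant (motors turning together).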

Other hands, such as the Shadow hand designed in the U.K., also use antagonistic actuation. But the Shadow uses pneumatic artificial muscles, which have limitations in how much they can vary their stiffness.

Before developing the new hand, Grebenstein designed the hand of another advanced robot, the humanoid Justin. He says that in one experiment they would throw heavy balls and have Justin try to catch them. “The impact would strain the joints beyond their limits and kill the fingers,” he says.

The new hand can catch a ball thrown from several meters away. The actuation and spring mechanisms absorb the kinetic energy without structural damage.

But the hand can’t always be in a stiff mode. To do manipulation tasks that require accuracy, it’s better to have a hand with low stiffness. By adjusting the tendon motors, the DLR hand can do just that.

To operate the hand, the researchers use special sensor gloves or simply send grasping commands. The control system is based on monitoring the joint angles. It doesn’t need to do impedance control, Grebenstein says, because the hand has compliance within the mechanics.

To detect whether an object is soft and must be handled more gently, the hand measures force by keeping track of the elongation of the spring mechanisms.
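A minimal sketch of that sensing scheme, assuming linear springs (the DLR mechanism actually uses nonlinear ones); the spring rate and threshold below are invented for illustration:

```python
# Force sensing via spring elongation (Hooke's law), with invented constants.

SPRING_RATE = 3000.0  # N/m, hypothetical

def tendon_force(elongation_m):
    """Estimate tendon force from the measured spring elongation."""
    return SPRING_RATE * elongation_m

def is_soft_object(elongations_m, finger_travel_m, force_threshold_n=5.0):
    """If the fingers keep closing while the measured forces stay low,
    the object is yielding: treat it as soft and grip gently."""
    peak = max(tendon_force(e) for e in elongations_m)
    return finger_travel_m > 0.01 and peak < force_threshold_n
```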

“In terms of grasping and dexterity, we’re quite close to the human hand,” he says, adding that the new hand is “miles ahead” of Justin’s hands.

About 13 people have worked on the hand, and Grebenstein insists it’s hard to estimate the cost of the project. But he says that the hardware for one hand would cost between 70,000 and 100,000 euros.

The researchers are now building a complete two-arm torso called the DLR Hand Arm System. Their plan is to study innovative grasping and manipulation strategies, including bimanual manipulations. 

Grebenstein hopes that their new approach to hand design will help advance the field of service robots. He says that current robot hardware has limited new developments because it's costly, and researchers can't afford experiments that might damage the robots.

“The problem is," he says, "you can’t learn without experimenting.”


Images: DLR


Robots With Knives
Thu, May 06, 2010

Blog Post: What would happen if a knife-wielding robot struck a person?

Top 20 Robot Videos of 2010
Tue, January 11, 2011

Blog Post: Quadrotors performing acrobatics, ultrarealistic humanoids dancing, dexterous robots folding towels, and more

DARPA's Manipulation Plans
Mon, October 18, 2010

Blog Post: An ambitious four-year program aims at transforming robotic manipulation from art into science

Superfast Robot Eyes
Mon, November 01, 2010

Blog Post: German researchers have developed robotic eyes that move at superhuman speeds

Cloud Robotics: Connected to the Cloud, Robots Get Smarter

cloud robotics google cellbot
Image: Cellbots

In the first “Matrix” movie, there’s a scene where Neo points to a helicopter on a rooftop and asks Trinity, “Can you fly that thing?” Her answer: “Not yet.” Then she gets a “pilot program” uploaded to her brain and they fly away.

For us humans, with our non-upgradeable, offline meat brains, the possibility of acquiring new skills by connecting our heads to a computer network is still science fiction. Not so for robots.

Several research groups are exploring the idea of robots that rely on cloud-computing infrastructure to access vast amounts of processing power and data. This approach, which some are calling "cloud robotics," would allow robots to offload compute-intensive tasks like image processing and voice recognition and even download new skills instantly, Matrix-style.

Imagine a robot that finds an object that it's never seen or used before—say, a plastic cup. The robot could simply send an image of the cup to the cloud and receive back the object’s name, a 3-D model, and instructions on how to use it, says James Kuffner, a professor at Carnegie Mellon currently working at Google.

Kuffner described the possibilities of cloud robotics at the IEEE International Conference on Humanoid Robots, in Nashville, Tenn., this past December. Embracing the cloud could make robots “lighter, cheaper, and smarter,” he said in his talk, which created much buzz among attendees.

For conventional robots, every task—moving a foot, grasping an object, recognizing a face—requires a significant amount of processing and preprogrammed information. As a result, sophisticated systems like humanoid robots need to carry powerful computers and large batteries to power them.

According to Kuffner, cloud-enabled robots could offload CPU-heavy tasks to remote servers, relying on smaller and less power-hungry onboard computers. Even more promising, the robots could turn to cloud-based services to expand their capabilities.

As an example, he mentioned the Google service known as Google Goggles. You snap a picture of a painting at a museum or a public landmark and Google sends you information about it. Now imagine a “Robot Goggles” application, Kuffner suggested; a robot would send images of what it is seeing to the cloud, receiving in return detailed information about the environment and objects in it.

Using the cloud, a robot could improve capabilities such as speech recognition, language translation, path planning, and 3D mapping.
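In code, the offload pattern might look like the sketch below. The endpoint URL and JSON response format are hypothetical; no such "Robot Goggles" service actually exists:

```python
# A minimal sketch of cloud offload with an onboard fallback.
# The URL and response format below are invented for illustration.
import json
import urllib.request

CLOUD_URL = "https://example.com/robot-goggles/recognize"  # hypothetical

def recognize(image_bytes, onboard_recognizer, timeout_s=0.5, url=CLOUD_URL):
    """Ask the cloud first; fall back to a cheaper onboard model when the
    network is slow or down, so the robot never goes 'brainless'."""
    try:
        req = urllib.request.Request(
            url, data=image_bytes,
            headers={"Content-Type": "application/octet-stream"})
        with urllib.request.urlopen(req, timeout=timeout_s) as resp:
            return json.load(resp)  # e.g. {"name": "cup", ...}
    except OSError:  # URLError, DNS failures, and timeouts subclass OSError
        return onboard_recognizer(image_bytes)
```

The fallback branch matters: as discussed later in the article, a robot that depends entirely on the cloud stops working when the connection does.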

The idea of connecting a robot to an external computer is not new. Back in the 1990s, Masayuki Inaba at the University of Tokyo explored the concept of a “remote brain,” as he called it, physically separating sensors and motors from high-level “reasoning” software.

Now cloud robotics seeks to push that idea to the next level, exploiting the cheap computing power and ubiquitous Net connectivity available today.

Kuffner, who currently works on Google’s self-driving car project, realized that running computing tasks on the cloud is often much more effective than trying to do them locally. Why can’t robots do the same?

As a side project, he's now exploring a variety of cloud robotics ideas at Google, including "using small mobile devices as Net-enabled brains for robots,” he told me. "There is an active group of researchers here at Google who are interested in cloud robotics," he says.

Last month, some of his colleagues unveiled their Android-powered robot software and a small mobile robot dubbed the cellbot [see image above]. The software allows an Android phone to control robots based on Lego Mindstorms and other platforms.

An app store for robots

But cloud robotics is not limited to smartphone robots. It could apply to any kind of robot, large or small, humanoid or not. Eventually, some of these robots could become more standardized, or de facto standards, and sharing applications would be easier. Then, Kuffner suggested, something even more interesting could emerge: an app store for robots.

The app paradigm is one of the crucial factors behind the success of Apple’s iPhone and Google’s Android. Applications that are easy to develop, install, and use are transforming personal computing. What could they do for robotics?

It’s too early to say. But at the Nashville gathering, attendees received Kuffner’s idea with enthusiasm.

“The next generation of robots needs to understand not only the environment they are in but also what objects exist and how to operate them,” says Kazuhito Yokoi, head of the Humanoid Research Group at Japan's National Institute of Advanced Industrial Science and Technology (AIST).  “Cloud robotics could make that possible by expanding a robot’s knowledge beyond its physical body.”

“Coupling robotics and distributed computing could bring about big changes in robot autonomy,” said Jean-Paul Laumond, director of research at France’s Laboratory of Analysis and Architecture of Systems, in Toulouse. He says that it’s not surprising that a company like Google, which develops core cloud technologies and services, is pushing the idea of cloud robotics.

But Laumond and others note that cloud robotics is no panacea. In particular, controlling a robot’s motion—which relies heavily on sensors and feedback—won’t benefit much from the cloud. “Tasks that involve real time execution require onboard processing,” he says.

Stefan Schaal, a robotics professor at the University of Southern California, says that a robot may solve a complex path planning problem in the cloud, or possibly other optimization problems that do not require strict real-time performance, "but it will have to react to the world, balance on its feet, perceive, and control mostly out of local computation."

And there are other challenges. As any Net user knows, cloud-based applications can get slow, or simply become unavailable. If a robot relies too much on the cloud, a problem could make it "brainless."

Kuffner is optimistic that new advances will make cloud robotics a reality for many robots. He envisions a future when robots will feed data into a "knowledge database," where they'll share their interactions with the world and learn about new objects, places, and behaviors.

Maybe they'll even be able to download a helicopter pilot program?

Below are some other examples of cloud robotics projects:

• Researchers at Singapore's ASORO laboratory have built a cloud computing infrastructure to generate 3-D models of environments, allowing robots to perform simultaneous localization and mapping, or SLAM, much faster than by relying on their onboard computers. The back-end system consists of a Hadoop distributed file system that can store laser scans, odometry data, and image and video streams from cameras. The researchers hope that, in addition to SLAM, the cluster could also run sensor fusion and other computationally intensive algorithms.

• At LAAS, Florent Lamiraux, Jean-Paul Laumond, and colleagues are creating object databases for robots to simplify the planning of manipulation tasks like opening a door. The idea is to develop a software framework where objects come with a "user manual" for the robot to manipulate them. This manual would specify, for example, the position from which the robot should manipulate the object. The approach tries to break down the computational complexity of manipulation tasks into simpler, decoupled parts: a simplified manipulation problem based on the object's "user manual," and whole-body motion generation by an inverse kinematics solver, which the robot's computer can solve in real time.

• Gostai, a French robotics firm, has built a cloud robotics infrastructure called GostaiNet, which allows a robot to perform speech recognition, face detection, and other tasks remotely. The small humanoid Nao, by Aldebaran Robotics, will use GostaiNet to improve its interactions with children as part of a research project at a hospital in Italy. And Gostai's Jazz telepresence robot uses the cloud for video recording and voice synthesis.

• At present the iCub humanoid project doesn't rely on "cloud robotics," but Giulio Sandini, a robotics professor at the Italian Institute of Technology and one of the project's leaders, says it's "a precursor of the idea." The iCub, an open child-sized humanoid platform, works as a "container of behaviors," Sandini says. "Today we share simple behaviors, but in the same way we could develop more complex ones like a pizza making behavior, and our French collaborators could develop a crepes making behavior." In principle, you'd just upload a "behavior app" to the robot and it would cook you pizzas or crepes.
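The "user manual" idea from the LAAS project above could be sketched as a small database record. All field names and values here are invented for illustration:

```python
# Hypothetical object "user manual" record a robot could fetch from a
# shared database before attempting a manipulation task.
from dataclasses import dataclass

@dataclass
class ObjectManual:
    name: str
    approach_pose: tuple   # (x, y, yaw): where the robot should stand
    grasp_point: tuple     # (x, y, z) on the object, in the object's frame
    motion_hint: str       # e.g. how the grasped part is allowed to move

MANUALS = {
    "door": ObjectManual(
        name="door",
        approach_pose=(1.0, 0.0, 3.14),
        grasp_point=(0.05, -0.8, 1.0),
        motion_hint="pull handle along the door's hinge arc"),
}
```

Given such a record, the planner only has to solve the remaining whole-body motion problem, which is what makes the decomposition tractable in real time.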

[If you know of other cloud robotics projects, let me know.]

And here's Kuffner's PowerPoint presentation:

Cloud Enabled Robots

Autom, the Robot That Helps You Lose Weight


Autom wants to make you healthier. This little robot keeps track of your eating and exercise habits -- and encourages you to stay in shape.

Autom speaks with a synthetic female voice, and you interact with it using its touch-screen belly. It won't scold you if you ate two desserts last night; Autom is a very kind robot.

But can it really help you lose weight?

We met Autom, and one of its creators, Cory Kidd, co-founder and CEO of Intuitive Automata, at CES early this month.

Kidd claims that, yes, Autom can help people lose weight. The robot is more effective than weight-loss websites and smartphone apps, he says, because people develop a bond with the robot and stick with it longer.

Kidd started developing Autom a few years ago as a grad student at MIT. With two colleagues, he founded Intuitive Automata, based in Hong Kong, to commercialize the robot.

Watch Kidd explaining how Autom works:

I think they are onto something here, but I see some limitations in the current robot. First, the speech synthesis is very robotic. Second, the robot has no voice recognition at all. It would be nice if the robot could speak more naturally and if at least basic interactions -- like answering "yes" or "no" -- could happen via voice. The good news is that the company might be able to improve these features later through software updates.

Another question is whether consumers want a robotic weight-loss coach in the first place, and how much they're willing to shell out.

Intuitive Automata plans to start selling Autom on its website later this year for around US $500 or $600. But in the video Kidd mentions something interesting: They plan to sell the robot also via health insurance companies and employers, which would give -- or subsidize -- the robots to customers and employees.

Would you take Autom home?

Photo and video: Josh Romero & Joe Calamia/IEEE Spectrum 


iRobot Scooba 230: How It Works
Fri, January 14, 2011

Blog Post: iRobot shrunk the Scooba. How did they do it?

Robot Suit HAL Demo at CES 2011
Sun, January 09, 2011

Blog Post: A man turns into a cyborg at the Consumer Electronics Show

The Best Robots of CES 2011
Tue, January 11, 2011

Blog Post: Robots made a big appearance at this year's Consumer Electronics Show in Las Vegas

Top 20 Robot Videos of 2010
Tue, January 11, 2011

Blog Post: Quadrotors performing acrobatics, humanoids dancing, dexterous robots folding towels, and more

Windoro Window-Cleaning Robot Demo


In our best robots of CES roundup last week, it appears that we left out an interesting offering: the Windoro window-cleaning robot from South Korea.

That's right. This robot wants to do for your windows what Roomba and Scooba do for your floors. It's quite a sight to see this gizmo magically crawling on glass.

But there's no magic, of course. There's magnetism. The robot consists of two modules that go on opposite sides of the window and hold each other using permanent magnets. 

Watch how it works:

The mighty iRobot, with its best-selling Roomba vacuums and innovative Scooba floor-washing bots, dominates the cleaning-robot market. But now Ilshim Global, a small firm from Gyeongsan, South Korea, wants to claim a new part of that market -- the vertical segment, so to speak.

Unveiled late last year, the Windoro robot was a joint development between Ilshim and the Pohang Institute of Intelligent Robotics, or PIRO. The machine measures about 20 centimeters (7.9 inches) on a side and weighs in at 2.7 kilograms (6 pounds). 

The Windoro robot can clean windows 6 to 25 millimeters thick (0.2 to 1 inch). And no, you won't see it hanging on skyscrapers -- its creators say it's designed for cleaning windows at homes and stores.

One of the robot's two modules works as the navigation unit. It uses accelerometers to navigate and bump sensors to detect obstacles and window frames. The other module is the cleaning unit, which has four spinning microfiber pads and a reservoir that dispenses detergent.

The robot first moves up and down and left and right to determine the dimensions of the window. It then follows a zigzag pattern to cover the entire surface, moving at an average speed of 8 centimeters per second and returning to the starting point when it's finished.
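That coverage routine amounts to a boustrophedon sweep. Here's a grid-based sketch, where cells stand in for pad-width strips of glass; the real Windoro plans over measured window dimensions, not grid cells:

```python
# Zigzag (boustrophedon) coverage sketch: sweep alternate columns
# top-to-bottom and bottom-to-top, then return to the start.

def zigzag_path(cols, rows):
    """Visit every cell of a cols x rows window exactly once, then
    come back to the starting corner, like the robot described above."""
    path = []
    for x in range(cols):
        ys = range(rows) if x % 2 == 0 else range(rows - 1, -1, -1)
        path.extend((x, y) for y in ys)
    path.append((0, 0))  # return to the starting point when finished
    return path
```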

One battery charge lasts about 2 hours, and the robot can clean a surface of up to 12 square meters (130 square feet).

The Windoro robot will first go on sale in South Korea, followed by Europe, over the next couple of months. It should be available in the United States in April and will retail for about US $400.

UPDATE 1/19: Corrected maximum surface area robot can clean.

Photo and video: Josh Romero & Joe Calamia/IEEE Spectrum 



Autonomous Quadrotor Teams May Build Your Next House

Back in July, we wrote about how UPenn’s GRASP Lab had taught their quadrotors to work together to grasp and move things. The next step, it seems, is teaching the quadrotors to work together to grasp and move things and actually build buildings. The video above shows a team of quadrotors cooperating to construct the framework of a (rather small) building. The building’s structure is held together with magnets, and the quadrotors are able to verify that the alignment is correct by attempting to wiggle the structural components around, which is pretty cool.

It’s fun to speculate about how this technology might grow out of the lab into the real world… To build actual buildings, you’d either need much bigger quadrotors (which is possible), lots of small quadrotors cooperating on big pieces (also possible), or buildings built out of much smaller components (which might be the way to go). The quadrotors probably wouldn’t be able to do all the work, but they have the potential to make construction projects significantly more efficient.


iRobot Scooba 230: How It Works

I would love to have a Scooba, iRobot's floor-washing robot, to keep my kitchen and bathroom shining. But for my New York City-sized dwelling (read: tiny cramped apartment), the rotund robot is overkill -- it could probably clean the entire bathroom floor just by spinning in place.

It appears that iRobot heard the same complaint from many people and decided to shrink the Scooba. The new Scooba 230, unveiled at CES, is about the same height but only half the diameter of the original Scooba 300 series [see photo above]. At 16 centimeters in diameter (6.5 inches) and 9 centimeters tall (3.5 inches), the new Scooba can get into small areas such as that dreaded space around the toilet.

Like the original models, the shrunken Scooba uses a three-stage cleaning approach: first, it deposits water or a cleaning solution on the floor; then it uses scrubbing brushes to loosen dirt and grime; finally, a squeegee vacuum removes the dirty water. The Scooba 230 is designed to clean up to 14 square meters (150 square feet) of tile, linoleum, or sealed hardwood floors in a single session, while the larger Scooba units can clean from 23 to 80 square meters (250 to 850 square feet), depending on the model. Watch the video below to see how the Scooba 230 works:

And how did iRobot engineers manage to shrink the robot and still allow it to clean a sizable area?


The trick is that the robot uses the same internal volume to store both clean and dirty water. The two are separated by a flexible membrane and never mix; as the clean water goes out, the membrane makes more room for the dirty water coming in from the squeegee vacuum [see illustration].
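As a toy model of that shared-reservoir bookkeeping (class and numbers invented for illustration):

```python
# One fixed internal volume; a membrane lets the dirty side grow
# only as fast as the clean side shrinks.

class SharedReservoir:
    def __init__(self, total_ml):
        self.total = total_ml
        self.clean = total_ml
        self.dirty = 0.0

    def dispense(self, ml):
        """Put clean solution on the floor; this frees membrane room."""
        ml = min(ml, self.clean)
        self.clean -= ml
        return ml

    def recover(self, ml):
        """Vacuum dirty water back in, capped by the room freed so far."""
        room = self.total - self.clean - self.dirty
        ml = min(ml, room)
        self.dirty += ml
        return ml
```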

In terms of navigation software, the Scooba uses the same approach as the Roomba to make sure it covers an entire area, following walls, going around obstacles, and driving over the same spot multiple times -- and sensors below the front bumper prevent it from falling down stairs and other drop-offs.

The new Scooba 230 will be available this spring in the United States and will cost $300. We plan to post an in-depth review soon.

Interview: iRobot's AVA Tech Demonstrator

With all of the new competition in the consumer robotics field, it’s about time for iRobot to show that they’re still capable of innovating new and exciting things. AVA, their technology demonstrator, definitely fits into the new and exciting category.

AVA is short for ‘Avatar,’ although iRobot was careful not to call it a telepresence robot so as not to restrict perceptions of what it's capable of. AVA is capable of fully autonomous navigation, relying on a Kinect-style depth-sensing camera, laser rangefinders, inertial sensors, ultrasonic sensors, and (as a last resort) bump sensors. We got a rundown at CES a few days ago; check it out:

All of the sensor data crunching is handled by a heavyweight onboard computer, but the brains of the operation is really whatever AVA happens to be wearing for a head, in this case a tablet PC. This makes it easy to develop applications to control the robot, a concept not unlike the iRobot Create: the 'build a robot' part is done for you, leaving you free to focus on getting the robot to do cool stuff.

There are also a bunch of interesting ways to interact with AVA. You’ve got the tablet of course, if you want to do things the hard way. A second Kinect camera on the bot can detect people and recognize gestures, and an array of microphones can detect and interpret voice commands. Finally, AVA’s round ‘collar’ piece has touch sensors all the way around, offering an intuitive way to steer AVA around.

While iRobot wouldn’t speculate on what’s coming next for AVA (disappointing), telepresence is an obvious first application. AVA also has a bunch of expansion ports that you can attach stuff to, which obviously makes me think manipulators. Personally, I’m hoping that now that AVA is out in the open, iRobot will keep us updated with some of the new ideas that they’re playing around with.

[ iRobot ]

The Best Robots of CES 2011


Robots made a big appearance at this year’s Consumer Electronics Show in Las Vegas. There were home robots, robotic pets, humanoids, telepresence systems, and even a little robot to massage people’s backs. Check out the highlights:


• iRobot brought two new home robots to CES: a more powerful Roomba and a smaller Scooba washer [see photo above]. According to the company, the updated Roomba 700 series is 20 percent better at sucking up fine dirt particles and new power management software provides 50 percent longer battery life than previous Roomba generations. The new vacuum units start at US $450. The new Scooba 230, priced at $300, is 16 centimeters in diameter and 9 cm high, ideal to get into small areas such as that dreaded space around the toilet. According to the company, Scooba differs from a mop because it only uses clean solution to wash the floors, not dirty water. The robot has an active reservoir that keeps the cleaning solution and dirty water separate and it can clean 14 square meters of linoleum, tile, or sealed hardwood floors in a single session.


• iRobot was also showing off a telepresence robot prototype called AVA, which looks like an iPad on wheels. It seems that after its aborted ConnectR project -- a telepresence robot based on the Roomba platform -- iRobot is trying to catch up in the telepresence arena. The AVA prototype was quite bulky and didn't move much, but the interesting thing is that iRobot wants to allow developers to create apps to make the robot do useful things. [UPDATE: Okay, iRobot is not calling its prototype a telepresence robot, although AVA is short for avatar. BotJunkie has the details.] Watch iRobot CEO Colin Angle explaining the idea behind AVA:


• Paro, the therapeutic robot seal, was drawing lots of visitors who wanted to caress the furry creature, but another therapeutic robot was also getting a lot of attention -- and it was the robot that was caressing people. The WheeMe, created by Israeli company DreamBots, uses tilt sensors to balance on a person's back, moving slowly as its four sprocket-like rubber wheels press gently on the skin. As we wrote before, the company admits that the robot can't give you a deep tissue massage, because it's very light (240 grams, or 8.5 ounces), but it claims the device can provide "a delightful sense of bodily pleasure." It will retail for $69.


• The Fujitsu Emotion Bear is a robotic teddy bear with a camera in its nose, motors stuffed in its body, and advanced AI. The bear has 13 touch sensors and runs image recognition software to recognize people. Like Paro the robot seal, it's designed to interact with children, elderly, and infirm people, though one can imagine it could become a robot toy like the dinosaur robot Pleo or Sony's Aibo dog robot. It can move its head and paws, track people's faces, laugh, cry, and sneeze. The bear is a concept product and Fujitsu hasn't announced any plans to sell it.


• Developed by Orbotix of Boulder, Colo., Sphero is a robotic ball that you can control with an iPhone or iPad via Bluetooth. Slide your finger on a circular control to move the ball, and you can play office golf or challenge a friend to a game of sumo ball. The Sphero balls change color but don't have cameras or other sensors. Some people may argue this is just a remote-controlled toy, not a robot, but Orbotix hopes that by providing an easy-to-use open API, app developers can add new capabilities to the ball bot. No details on price and availability, except that it should cost less than $100 and hit the market later this year.


• Murata Boy, developed by Murata Manufacturing Co., is a little humanoid robot that rides a bicycle. It made an appearance at CES along with a new companion: Murata Girl, which rides a unicycle and blushes and nods her head. Both robots can balance in place or even ride along a narrow beam. Show demonstrators controlled them by waving specially designed wands.


• The Vgo robot, created by Vgo Communications in Nashua, N.H., allows remote workers to not only see, hear, and talk but also move around and collaborate more effectively with colleagues. Unveiled last June, the telepresence robot sells for $5,000 plus a service contract -- an attractive price compared to competitors such as the Anybots QB, which costs $15,000. The Vgo robot is rather short (1.2 meters, or 4 feet, tall), and one wonders how it feels to embody it. The company says that executives at Palantir Health and Orbitz have been using the robot to improve collaboration and reduce travel across multiple offices.


• Finally, my favorite robot demo was when Japanese company Cyberdyne allowed tech journalist Evan Ackerman to try out its robot suit HAL. It's not every day you get a chance to step into a robotic exoskeleton that can sense when you want to move your legs and move them for you! Designed to help the elderly and disabled regain mobility, the HAL suit is available to hospitals and clinics in Japan and rents for about $1,500 per month. Ackerman became the first person in the United States to try the legs -- and he liked them.

For more gadget news, check out our complete coverage of the 2011 Consumer Electronics Show.


Top 20 Robot Videos of 2010
Tue, January 11, 2011

Blog Post: Quadrotors performing acrobatics, humanoids dancing, dexterous robots folding towels, and more

Holiday Season Robot Videos
Fri, December 24, 2010

Blog Post: This flying robot wishes you a happy holiday season -- and it will even play a song for you

How to Build Your Own UAV
Fri, October 29, 2010

Blog Post: The Robots Podcast interviews DIY Drones founder and Wired editor Chris Anderson

Robotic Drone Flies Itself
Sun, December 12, 2010

Blog Post: This UAV identifies visual cues on the ground and steers itself autonomously

Top 20 Robot Videos of 2010

Last year was an incredible time for robotics, and to recap the best robot moments of 2010 we decided to compile a list of our favorite videos. Check out our selection below -- going from No. 20 to No. 1 -- and let us know what you think.


No. 20 Let's start off with the musculoskeletal humanoid Kojiro, built at the JSK Robotics Lab in Tokyo. With a body that mimics the way our skeletons and muscles work, it's surely one of the coolest -- and strangest -- robots of 2010. 


No. 19 Last year, the Stanford Racing Team showed one of the most extreme stunts a robotic car has ever pulled off: They taught their Junior vehicle to accelerate in reverse, then suddenly hit the brakes, turn the wheel, and start a 180-degree skid, ending up right in a desired parking spot. It's all for research!


No. 18 Built by University of Tokyo researchers, the Athlete robot uses artificial muscles to run like a human. Sort of. So far it can only perform a short dash, but hey, we're cheering for it. Run, Athlete robot, run.


No. 17 Among last year's robotics milestones is the emergence of commercial telepresence robots. And Silicon Valley startup Anybots was probably first to hit the market with its skinny alien-looking QB robot, which made the future seem a little bit closer by allowing people to roam around embodied as robotic avatars.


No. 16 Another weird one, the Telenoid R1, created by Japanese roboticist Hiroshi Ishiguro, is a telepresence robot with the body of a fetus, sperm, or Casper the Friendly Ghost, depending on who you ask.


No. 15 Boston Dynamics, famous for its BigDog quadruped, is also building a biped robot, called Petman. Last year, the company showed that the robot could run at 4.4 miles per hour (about 7 kilometers per hour). Can I get a pair of legs like that?


No. 14 It's always fun to see Honda's Asimo doing its thing. Last year, the astronaut-looking humanoid made an appearance at Ars Electronica in Linz, Austria, where it performed some old tricks as well as some new ones. You funny, Asimo.


No. 13 Speaking of Asimo, Iran seems to be a fan. Engineers at Tehran University built an adult-size humanoid called Surena. The robot can walk, stand on one foot, and even perform a little dance. It also loves to be on TV.


No. 12 Working to develop brain-machine interfaces to help paralyzed people, scientists at the University of Pittsburgh achieved an extraordinary result last year: They taught a monkey to control a 7-degrees-of-freedom robotic arm. Did we mention the monkey was using only its mind?


No. 11 Some engineers know how to have fun. At the Max Planck Institute for Biological Cybernetics, in Tübingen, Germany, researchers decided to combine a massive KUKA manipulator arm and a Formula 1 video game. The result is the most awesome racing car simulator ever built.


No. 10 The Honda U3-X personal mobility device is not exactly a robot, but this amazing unicycle does use balancing technology from Asimo -- and it definitely comes from the future.


No. 9 It's hard to believe that researchers were able to make a swarm of bacteria build a tiny pyramid. We have news for you: now they want to use this type of bacteria to power microscopic robots -- inside your body!


No. 8 Though the U.S. military routinely uses robots to disarm bombs in Iraq and Afghanistan, a video emerged last year showing a robot launching a bomb. The weaponized iRobot PackBot was capable of launching an explosive rope to eliminate obstacles and clear roads. That's an explosive video.


No. 7 Hiroshi Ishiguro has built some scary robots (including a copy of himself and the aforementioned Telenoid). But Geminoid F is another story. This ultrarealistic humanoid is a copy of a young Japanese woman and it can talk, move its eyes and head, frown, and smile. In fact, she even got a job.


No. 6 Quadrotors have been gaining popularity, and last year several groups demonstrated some impressive results. One group in particular, the GRASP Lab at the University of Pennsylvania, stood out for its acrobatics, with its machines flying and looping through obstacles and even landing on vertical surfaces.


No. 5 Is this the most amazing adult-size humanoid ever built? Possibly. AIST's HRP-4 is sleek, athletic, and graceful, and few, if any, robots can move like it does. As one observer put it, it "will make you bow in deference."


No. 4 Robots with knives, robots with knives! This was one of our favorite stories of 2010. To study what kind of injury a knife-wielding robot might cause, German engineers built -- what else? -- a knife-wielding robot and set it loose on a pig carcass. But they didn't stop there. After devising a collision-detection system, the researchers were confident enough to test it on their own flesh.


No. 3 We're down to the final three videos, which feature very different robots but have one thing in common: They all captured people's imagination. First up is the humanoid HRP-4C, which last year showed off its dance moves along with a troupe of real human dancers in a video that went viral on the Net.


No. 2 In another popular story of 2010, U.C. Berkeley researchers programmed a PR2, an advanced robot developed by Willow Garage, to fold towels. Video of the robot neatly folding towel after towel was seen by tens of thousands of people soon after it was released. Which proves that people really hate folding towels. The PR2 was also responsible for several other cool videos in 2010 -- it could have its own top 20 videos list! -- like this one of the bot playing pool.


And finally...

No. 1 Many of the robots above are extremely sophisticated and expensive systems, capable of formidable feats. But sometimes simplicity and beauty win. Below is our choice for the No. 1 video of 2010. It shows a robot that balances on a ball. Beautiful.


What do you think of our list? Do you disagree with any of our choices? Think we forgot something? Let us know.


Robot Suit HAL Demo at CES 2011
Sun, January 09, 2011

Blog Post: A man turns into a cyborg at the Consumer Electronics Show


Robot Suit HAL Demo at CES 2011

cyberdyne hal robot suit

A man turned into a cyborg yesterday at the Consumer Electronics Show in Las Vegas when he stepped into a powered robot suit that moved in response to nerve signals in his legs.

Technology journalist Evan Ackerman became the first person in the United States to test the robotic exoskeleton Hybrid Assistive Limb, or HAL, created by Japanese company Cyberdyne [see photo above].

Several companies and labs in the United States and Japan are developing robot suits to help disabled and elderly people regain mobility -- or to give soldiers superhuman strength.

Professor Yoshiyuki Sankai, head of the Cybernics Lab at the University of Tsukuba, founded Cyberdyne to commercialize the suit, which he started developing more than a decade ago.

Cyberdyne is conducting several patient trials in Japan, where it also rents its device to hospitals and clinics for about U.S. $1,500 per month. The company also said it was contacted by the U.S. military, which is apparently interested in testing the exoskeleton.

The HAL suit, which weighs 10 kilograms, consists of a lightweight frame that straps to the body. Its electric motors act as artificial muscles that provide powered assistance to the wearer's limbs.

Cyberdyne has publicly demonstrated its exoskeleton several times but previously only the company's engineers or patients were allowed to wear the robot suit.

At yesterday's demo, Ackerman got a taste of the future by becoming a man-machine hybrid. Though he tried just the robot legs (the company also makes a full suit that includes powered arms), he said the experience was "incredible."


To use the suit, Cyberdyne employee Takatoshi Kuno first attached sensors to Ackerman's legs. The sensors monitor the electrical activity of nerves to control the suit's dc motors.

The suit works on intent: the user needs only to "think" of moving his or her legs -- the suit does the rest. That's because the brain sends signals to the muscles of the legs, and the sensors detect them.

"Once I figured out how to stop trying to walk in the suit and just let the suit walk for me, the experience was almost transparent," Ackerman said.

The suit includes a pouch with a computer, Wi-Fi card, and battery, and it sends data about its operation to a remote PC. Cyberdyne's Kuno said he set the suit on "level 1," because Ackerman's legs had normal strength; for people with weaker muscles, the suit could go to level 4.
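To make the idea concrete, here is a purely illustrative sketch of intent-based assist control. Everything in it -- the function name, the thresholds, the gains, and the 1-to-4 "assist level" scaling -- is a hypothetical simplification, not Cyberdyne's actual control scheme: sensors on the skin pick up the electrical activity headed for the leg muscles, and the suit's motors add torque in proportion to that signal, with higher levels providing more help for weaker muscles.

```python
def assist_torque(emg_signal: float, assist_level: int = 1,
                  threshold: float = 0.05, base_gain: float = 10.0) -> float:
    """Map a normalized muscle-activity reading (0 to 1) to an assist torque.

    Hypothetical example values: readings below `threshold` are treated as
    sensor noise and produce no assistance; above it, torque grows linearly,
    scaled by the assist level (1 = light help, 4 = strong help).
    """
    if not 1 <= assist_level <= 4:
        raise ValueError("assist level must be between 1 and 4")
    intent = max(emg_signal - threshold, 0.0)   # ignore the noise floor
    return base_gain * assist_level * intent    # more help at higher levels

# The same muscle signal gets four times the motor torque at level 4
# (for weak muscles) as it does at level 1 (for healthy legs).
light = assist_torque(0.15, assist_level=1)
strong = assist_torque(0.15, assist_level=4)
```

This matches the behavior Kuno described: a healthy wearer like Ackerman runs at level 1, where the suit only nudges, while a patient with weaker muscles would run at a higher level, where the same faint intent signal produces much more motor assistance.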

Ackerman walked around the room and also climbed stairs to go up and down the stage. At first he appeared to struggle to move his legs, but after just a few minutes he was feeling comfortable in his new robot body.

"I didn't try to kick anything to pieces Iron Man style," Ackerman said, "but going up stairs was definitely all the suit doing the work and not me."

Photos: Joe Calamia/IEEE Spectrum

For more gadget news, check out our complete coverage of the 2011 Consumer Electronics Show.



IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:

Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
Jason Falconer
Angelica Lim
Tokyo, Japan
