It turns out that studying how to make robots grasp objects with their hands is helping researchers figure out how to make robots balance on their feet.
Christian Ott and his team at the German Aerospace Center's Institute of Robotics and Mechatronics have discovered a way to keep bipedal robots from falling over by using principles from robot grasping.
Rescue robots don't always have to be big and burly and complicated. Usually, if you put something big and burly and complicated in an environment with lots of water and dust, all the big and burly complicated bits get decidedly less complicated by virtue of ceasing to function. You can seal up individual parts (like wheels or tracks) as best you can, but sealing up the entire robot offers even more durability. The SCV (Slug Crawler Vehicle) from the Chiba Institute of Technology in Japan relies on a flexible, waterproof "skin" to protect it from the elements while still allowing it to get around pretty well:
Healthcare and elder care is a big concern in Japan, whose population is aging more rapidly than their current human-centric infrastructure is prepared to cope with. Companies like Toyota are hoping that robots will be able to pick up a little bit of the slack, and this week they've introduced four new robotic systems designed to help keep people healthy and independent as long as possible.
The first couple of systems are designed to provide single-leg walking assistance to people who have balance issues, or even people suffering from complete paralysis in one leg. The robotic structure (it's a lot like Cyberdyne's exoskeleton) is capable of supporting the entirety of your weight on one leg, and it will swing your leg forward for you as you walk. If you can hold yourself up, the second system will provide you with visual feedback to help you get your balance back and start walking on your own.
If that's not exciting enough for you, the third system turns balance training into a game. You can play virtual games of tennis, football, or basketball, and you'll be challenged to maintain your balance while controlling your character on the screen:
The final system is more for caretakers than patients; it's a robot that helps someone transfer someone else from (say) a bed to (say) a toilet. And, well, there's a demo of that, too:
As you can see, all of these prototypes are currently operational, and Toyota is expecting commercialization to occur sometime in 2013.
Naughty robots can now be tamed with this snazzy smile-detecting device from the University of Tsukuba AI Lab. Anna Gruebler and her colleagues have developed a wireless headband that captures electromyographic (EMG) signals from the side of the face, detecting when you're smiling with delight or frowning with disapproval.
Unlike cameras with smile-detection algorithms, this device can work in low light, while you're walking around, and when you're not looking into your computer's camera. Part of the charm, the researchers say, comes from the discreet headband design that beats traditional face electrodes and wires.
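The signal-processing idea behind a device like this is straightforward in principle: rectify the raw EMG, smooth it into an activity envelope, and threshold. Here's a minimal illustrative sketch; the channel assignments, thresholds, and window size are my own assumptions, not details of the Tsukuba system:

```python
import numpy as np

def emg_envelope(raw, window=50):
    """Rectify an EMG trace and smooth it with a moving average."""
    rectified = np.abs(raw - np.mean(raw))  # remove DC offset, full-wave rectify
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def classify_expression(cheek, brow, smile_thresh=0.5, frown_thresh=0.5):
    """Crude expression guess from two facial EMG channels (illustrative)."""
    cheek_act = emg_envelope(cheek).mean()  # cheek muscles activate when smiling
    brow_act = emg_envelope(brow).mean()    # brow muscles activate when frowning
    if cheek_act > smile_thresh and cheek_act > brow_act:
        return "smile"
    if brow_act > frown_thresh:
        return "frown"
    return "neutral"
```

A real system would need per-user calibration and more robust features, but the low computational cost of this kind of pipeline is part of why it works while you're walking around, unlike camera-based smile detection.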
Last year, Gruebler proposed using the device to control avatars in Second Life hands-free, as in the explanation video below. More users approached her avatar, she says, because it was smiling and looked friendly.
The trainer tries to teach the robot her preference: give the ball or throw it. Although the Nao starts out slow and hesitant, it speeds up as it acquires experience and feedback from the trainer. The study also compared the headband to a manual interface: while users made mistakes using a dial, they never confused smiling and frowning -- a more natural, intuitive way to interact with a robot.
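Conceptually, the smile/frown signal can serve as a binary reward in a simple preference-learning loop. Here's a hypothetical sketch of that idea; the action names, learning rate, and bandit-style update are my own illustration, not the algorithm from the study:

```python
import random

class FacialFeedbackLearner:
    """Bandit-style learner: a smile counts as +1 reward, a frown as -1."""

    def __init__(self, actions, lr=0.3):
        self.values = {a: 0.0 for a in actions}  # value estimate per action
        self.lr = lr

    def choose(self, epsilon=0.1):
        # Mostly exploit the best-valued action, occasionally explore.
        if random.random() < epsilon:
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)

    def feedback(self, action, expression):
        reward = 1.0 if expression == "smile" else -1.0
        # Move the action's value estimate toward the received reward.
        self.values[action] += self.lr * (reward - self.values[action])
```

After a handful of smiles for "give" and frowns for "throw," the learner settles on giving the ball -- which matches the speeding-up behavior described above as the Nao accumulates feedback.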
The main idea, the researchers say, is that it's similar to how parents teach and encourage babies.
The next step is to apply the device to other real-life situations. If you could train a robot with a smile or frown, what would you have it do?
Angelica Lim is a graduate student at the Okuno and Ogata Speech Media Processing Group at Kyoto University, Japan.
Yeah, so this right here is a giant robotic spider. By "giant" I mean that those legs are 20 centimeters long each, and if the body adds another 20 centimeters, we're looking at a robot arachnid that's a terrifying two feet across (0.6 meters). For what it's worth, this is approximately twice the size of the largest real spider, the Goliath bird-eater, and the Goliath bird-eater doesn't even jump.
Oh yes, this robot jumps.
The neat thing about spiders (if you're into spiders, anyway), is that they're hydraulically operated. Instead of moving their limbs with muscles, they do it by increasing the blood pressure in whatever limb they want to extend. Hydraulically operated robots work the same way, except they have a hydraulic pump instead of a heart and hydraulic fluid instead of blood. This can be a very effective way of providing power to limbs, which is why Boston Dynamics uses a hydraulic system in AlphaDog and PETMAN.
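The appeal of hydraulics comes down to simple physics: the force a hydraulic cylinder can exert is the fluid pressure times the piston area (F = p·A), so a small, high-pressure cylinder delivers a lot of force. A quick back-of-the-envelope calculation (the numbers are illustrative, not specs for any of these robots):

```python
import math

def piston_force(pressure_pa, bore_m):
    """Force from a hydraulic cylinder: F = p * A, with A = pi * (bore/2)^2."""
    area = math.pi * (bore_m / 2) ** 2
    return pressure_pa * area

# A 2 cm bore cylinder at 20 MPa (a typical industrial hydraulic pressure):
force_n = piston_force(20e6, 0.02)
print(f"{force_n:.0f} N")  # prints "6283 N" -- over 600 kg-force from a tiny cylinder
```

That force-per-volume ratio is hard to match with electric motors, which is why hydraulics show up in both spider legs and Boston Dynamics' machines.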
Anyway, back to this freaky thing. Designed by a team at the Fraunhofer Institute for Manufacturing Engineering and Automation in Germany, this prototype robospider was 3D printed, meaning that more of them than I would personally be comfortable with can be manufactured quickly and cheaply. A hydraulic pump in the body provides fluid pressure to the limbs, allowing the robot to crawl forwards and backwards, and some versions are apparently powerful enough to leap off the ground, grab you by the throat, and rip your head off. Or maybe not that last bit. Maybe.
In any case, having eight legs makes the robot exceptionally nimble, which is the whole reason for utilizing this design. The body of the spiderbot also contains the control system and a variety of sensors to enable it to perform its primary mission, which is as "an exploratory tool in environments that are too hazardous for humans." Like, I dunno, environments that are full of giant spiders?
This is not the first sticky-treaded robotank, but as far as I know, it's the first one that can manage to go around corners and make that tricky transition from horizontal to vertical. The somewhat unfortunately named "Tailless Timing Belt Climbing Platform" (or TBCP-11) comes from Simon Fraser University way up there in Canada. It weighs 240 grams, and has no problems climbing up whiteboards, glass, and other slick surfaces.
The sticking power of those treads comes from the same handy little van der Waals forces that geckos use to effortlessly stick to, well, everything. Instead of tiny hairs, though, TBCP-11 uses tiny mushrooms, which provide a substantial amount of conformable surface area for the robot to use to adhere to walls.
Maximizing compliant surface area has been an issue for gecko-type (aka dry-adhesion) climbing robots for a long time; the material itself is spectacular, but the tough part is getting enough of the material to make contact with your climbing surface. For example, check out the picture of Stickybot III's toes in this article, and notice how little of the adhesive the robot is relying on to stick. This is one of the advantages of the TBCP-11: the continuous loops of adhesive material provide a lot of adhesion power.
While this robot does have some autonomous capability, it's still tethered for power, since batteries are heavy. It's going to take a little extra work to increase the strength of the adhesive so that the TBCP-11 can bring its power source onboard, and the SFU researchers are also trying to figure out how to get the thing to turn without the treads coming loose and causing the TBCP-11 to plummet to its doom.
But here's the problem: That's just part of the story. In fact, a small part. If we want to achieve a true robotics revolution, the reality is that the robots I mention above and others that the press likes to cover are not going to be enough. We need robots that can do everyday jobs, performing basic tasks over and over, safely and reliably. In other words, we need robots that will become so enmeshed in our lives that people stop paying attention to them: They will be ... boring.
I find that the tech press, and people in general, are not so inclined to get "excited" about boring robots. They should be.
To make an analogy, consider the air travel industry. Today, the press no longer covers successful round trips or touts the "miracle" of flying; instead, flying is an everyday routine that helps millions of ordinary people build business relationships, visit family and friends, and journey around the world. What was once considered the epitome of human dreams and desires is now a commoditized, uninteresting service that people take for granted, with the only things we care about being paying less and avoiding hassle.
I won't say I don't wish flying were a better experience. But the great thing about what happened to air travel is that it became accessible to millions of people. And that's the true commercial flight revolution. What about a true commercial robotics revolution? To get there, what we need is for robots to become as routine and uninteresting as passenger flight has become in the past century.
Robots today are like the first airplanes. They are as remarkable a technical endeavor as flight once was, and current demonstrations are as entertaining (and unproductive) as the first airplane stunts once were: They're great to watch, but true global change lies in the hands of real products that are safe, affordable, and -- that's right -- boring.
Let me shamelessly plug my own employer here. I work for Adept Technology, based in Pleasanton, Calif., the biggest U.S. industrial robotics company. Some of our robots have been featured in the tech press, because, yes, they make for cool videos. But let me introduce you to a "very boring" autonomous mobile robot platform called the Adept MT400 [photo above].
This is a small mobile vehicle designed for human environments. Basically it just roams around, commanded through push buttons, sensors, tablets, or smartphones, while avoiding bumping into people and objects. We're offering this little guy to third-party developers, including end users, integrators, researchers, and entrepreneurs, who are interested in developing applications for it.
Back to the air travel analogy, we're like an aircraft manufacturer looking for an airline -- a partner to build a service that can attract customers and become a promising business.
The MT400 is just one example, of course. What we need from robotics companies and roboticists everywhere are more boring robots: robots that are most appreciated when they complete a task smoothly and economically; robots that investors and companies can confidently build business models around.
What will the future look like when, similar to flying, robots become boring enough for companies to create services that help millions and millions of people? I don't know for sure, but I do know it won't be boring.
Agree? Disagree? Let me know.
Erin Rapacki is a product marketing manager at Adept Technology. She lives in the San Francisco Bay Area.
Photo illustration: street crossing photo by neovain via Flickr; MT400 photo: Adept Technology.
Man, just when we were getting close to making actual robot insects, some thoughtless researchers had to go and invent a robotic insect-eating plant. Sigh. The artificial Venus flytrap in the pic (which can apparently be abbreviated "VFT") is a creation of Mohsen Shahinpoor from the University of Maine.
Like a real VFT, this artificial plant has an intelligence of sorts, in the form of ionic polymeric metal composite trigger hairs on the inside of its polymer leaves. When something (like a tasty insect) touches one of the hairs, a copper electrode triggers the leaves to snap shut in 0.3 seconds, and a series of teeth interlock to keep whatever the robot has caught from escaping.
For now, this robot flytrap only snacks on the old-fashioned biological sorts of flies. It also doesn't currently have the infrastructure required to turn said flies into robot food, but that's just a matter of hooking up a microbial digester like this one to turn bugs into robot fuel.
If you've ever gotten stuck in a t-shirt, this robot is for you.
For most people, putting on a t-shirt isn't a chore, but researchers at the Nara Institute of Science and Technology in Japan have identified this important task as particularly difficult for elderly or disabled people with limited arm movement.
A cross-laboratory team led by Tomohiro Shibata and Takamitsu Matsubara developed a two-arm robot to slide a shirt over and onto a person's head and torso. Since a person's neck or arms may not be in the exact same position each time, a scripted movement could potentially cause distress.
Enter the team's reinforcement learning approach. Just like a child learning through experience, the robot is taught once how to clothe a human user, and is then given several attempts to put the shirt on by itself. Success is measured with a motion-capture system at the end of each trial, which lasts about 10 seconds.
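The trial-and-reward loop described above can be sketched in a few lines. This is an illustrative toy, not the team's actual algorithm: I'm standing in for the motion-capture score with a mock reward function, representing the arm trajectory as a short list of waypoints, and using simple hill climbing over whole episodes:

```python
import random

def dressing_reward(trajectory, target):
    """Mock motion-capture score: higher when waypoints match the target path."""
    return -sum((a - b) ** 2 for a, b in zip(trajectory, target))

def learn_trajectory(target, trials=200, noise=0.5, seed=0):
    """Episodic hill climbing: perturb the trajectory, keep it if reward improves."""
    rng = random.Random(seed)
    best = [0.0] * len(target)  # initial (taught) trajectory
    best_reward = dressing_reward(best, target)
    for _ in range(trials):
        # Try a randomly perturbed trajectory for one dressing attempt.
        candidate = [w + rng.gauss(0, noise) for w in best]
        r = dressing_reward(candidate, target)
        if r > best_reward:  # the mocap score says this trial went better
            best, best_reward = candidate, r
    return best, best_reward
```

The key property it shares with the real system is that only an end-of-episode score is needed -- no one has to program where the person's neck and arms will be, which is exactly why a scripted movement would fail.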
In the video, we see that after three learning trials, the robot has learned the trajectory to place the shirt without any trouble. According to Shibata, Japanese reporters who tested the system gave it a thumbs up, saying it made the shirt easy to put on.
For now, the system has been tested with only a couple of different t-shirts and with several subjects including a few patients. The next step, says Shibata, is to try out the system with more subjects and patients, and with different t-shirts. "Our approach could be applied to other types of important clothing tasks such as pulling up/down pants."
PETMAN is an adult-sized humanoid robot developed by Boston Dynamics, the robotics firm best known for the BigDog quadruped.
Today, the company is unveiling footage of the robot's latest capabilities. It's stunning.
The humanoid, which will certainly be compared to the Terminator Series 800 model, can perform various movements and maintain its balance much like a real person.
Boston Dynamics is building PETMAN, short for Protection Ensemble Test Mannequin, for the U.S. Army, which plans to use the robot to test chemical suits and other protective gear used by troops. It has to be capable of moving just like a soldier -- walking, running, bending, reaching, army crawling -- to test the suit's durability in a full range of motion.
Marc Raibert, the founder and president of Boston Dynamics, tells me that the biggest challenge was to engineer the robot, which uses a hydraulic actuation system, to have the approximate size of a person. "There was a great deal of mechanical design we had to do to get everything to fit," he says.
As I said before, this is the first time I've seen a machine performing movements like that -- remarkably human, yet uncanny valley-esque at the same time.
Led by Dr. Robert Playter, Boston Dynamics' VP of engineering, development of PETMAN got its start with a $26.3 million Army program. Two years ago, the company, based in Waltham, Mass., first demonstrated PETMAN's legs by having them walk on a treadmill. This year, the company showed that the robot legs can run at up to 7 kilometers per hour (about 4.4 miles per hour) and announced it had completed a prototype of the body.
But until now, the extent of PETMAN's full capabilities was a mystery.
Raibert says the humanoid and its behavior are still under development. "We plan to deliver the robot to the Army next year."
According to the Army requirements, the robot has to have about the same weight and dimensions as a 50th percentile male (the size of a standard crash-test dummy), or a mass of 80 kilograms (about 180 pounds) and a height of about 1.75 meters (roughly 5 feet 9 inches). PETMAN also has to simulate respiration, sweating, and changes in skin temperature based on the amount of physical exertion. Boston Dynamics used motion-capture systems to study the movements of humans as they performed a variety of exercises.
The robot relies on a tether that provides hydraulic power, but its body had to share space with many sensors and other components, so cramming everything together became a big engineering puzzle. And not only did the legs have to be strong, Raibert explains, but the upper body too, to allow the robot to crawl and stand up.
And I know some of you are wondering: Will it have a head? "We were a bit late getting the articulated neck mechanism working," he says, "but it is coming along, and a head along with it."
I also asked Raibert if they could eventually use PETMAN or PETMAN-related technologies in other projects. In other words, are we going to see PETMAN used in applications other than the chemical suit tests?
"You bet," he says. "There are all sorts of things robots like PETMAN could be used for. Any place that has been designed for human access, mobility, or manipulation skills. Places like the Fukushima reactors could be accessed by PETMAN-like robots (or AlphaDogs), without requiring any human exposure to hazardous materials. Perhaps firefighting inside of buildings or facilities designed for human access, like on board ships designed for human crews."
This, of course, will mean another big challenge for his team: transforming the humanoid from a tethered system into a free-standing, self-contained robot. Boston Dynamics, however, has already demonstrated its ability to transition to tether-less machines with its BigDog project.
One question remains unanswered, though: Will BigDog become PETMAN's best friend?