We've seen robots use a staggering variety of different techniques to climb things, and some of the most elegant (if not necessarily the most successful) are inspired by biology. Stanford's Stickybot is a good example of this, using nanoscale adhesive pads modeled on gecko feet to cling to smooth surfaces. But there are other animals that can stick to things even better than geckos can: our friends (and occasional enemies), the insects.
Insects climb in a couple of different ways. On rough surfaces, they usually rely on small claws (kinda like Spinybot), but on smooth surfaces, some insects secrete an oily fluid that helps turn the pads on their feet into little suction cups of a sort. Minghe Li, a roboticist at Tongji University in Shanghai, has created a climbing robot that mimics this capability using pliable silicone feet that squirt a mixture of honey and water onto the climbing surface.
It takes only a tiny bit of liquid for the feet to stick, and while the robot currently can't climb slopes steeper than 75 degrees, this method may ultimately prove to be as effective as the gecko-type sticky foot on smooth surfaces and more effective on rough or wet surfaces, which the gecko adhesive has trouble with. Also, making artificial gecko feet is tricky and expensive, while making honey just involves being nice to bees.
Li is currently adapting his robot's feet to better emulate those of insects to try to improve its climbing effectiveness. He's also, I imagine, looking for a way to clean up the sticky little footprints that are undoubtedly all over his lab.
UPDATE: I've obtained more photos of Elfoid -- see below.
Elfoid, a hybrid of cellphone and robot, transmits voice and motion to convey a person's "presence."
A pocket-size android shaped like a fetus might be your next cellphone.
Meet Elfoid, a miniature anthropomorphic robot unveiled today in Japan that works like a cellphone but is designed to transmit not only voice but also "human presence."
That's right. Next time you call your friends, you might be uploading yourself into this fetus body right into their pockets -- and their hands. Can you feel me now?
The idea is you use a motion-capture system to transmit your face and head movements to the Elfoid, which would reproduce them, plus your voice, on its own little body, thereby conveying your presence.
Last August, roboticist Hiroshi Ishiguro and his colleagues at the Advanced Telecommunications Research Institute International, known as ATR, where he's a visiting group leader, unveiled the Telenoid [photo right], an infant-size telepresence android that resembles, depending on whom you ask, Casper the Friendly Ghost, an overgrown sperm, or a developing fetus. Talk about embryonic technology.
Now the Japanese researchers have shrunk the Telenoid into a little robot elf you can carry in your pocket. The Elfoid P1, introduced today at a press conference in Tokyo, combines the robotic technology of the Telenoid with cellphone capability, allowing people to interact in a way that they can "feel each other's presence," according to Ishiguro. It seems the Elfoid can't move its face and limbs as the Telenoid does, but the researchers say they're planning to use microactuators to improve the device's movements.
His team received technical support from Qualcomm Japan to use a 3G communication unit on the android, and NTT Docomo assisted the researchers in testing the device.
Ishiguro says cellphones are constantly improving, and smartphones have shown how capable their interfaces can be, but one thing has remained the same: Voice still plays a big role in how we communicate, and voice has limitations.
Ishiguro says the human body -- capable of displaying and recognizing subtle cues and gestures -- is the most effective and natural interface for communication, so trying to use androids to capture these advantages makes sense. With the Elfoid, the researchers want to create "an innovative communication medium" capable of conveying human presence to remote locations using voice, appearance, motion, and touch (the Elfoid has a "soft, pleasant-to-the-touch exterior," they say).
And why the strange fetus-like looks? They explain that they sought a minimal design that could be recognized as male or female, old or young, so that users would use their imagination to make the robot more personal.
Any early adopters? Are you ready to "Elfoid" your friends?
Below, a video and more images:
Images: Osaka University and Advanced Telecommunications Research Institute International
The furry creature above? No, it's not Paro, the Japanese therapeutic robot seal. This hoop-shooting mechatronic harp seal is a creation of Taiwanese roboticists at the Industrial Technology Research Institute. The thing is not really a fully-actuated robotic animal; it's more of a manipulator arm disguised as a stuffed plush seal, with its multi-fingered gripper freakishly sticking out of the creature's mouth, for added Uncanny Valley-esque creepiness.
The researchers claim their robot can convert hoops 99 percent of the time, but keep in mind it's shooting a toy basketball at close range (the maximum distance in the experiment was 3 meters). Still, watching this bizarre bot in action is utterly entertaining.
Jwu-Sheng Hu and colleagues described their robot at the IEEE/RSJ International Conference on Intelligent Robots and Systems last October, where they presented the paper "A Ball-Throwing Robot with Visual Feedback."
It looks like a simple stunt, but there's a good deal of tech behind it. The robot uses a stereo vision system to compute the position of the hoop in three-dimensional space. Based on that position, an algorithm determines the angle and speed the robot needs to launch the ball to hit the target. After some calibration procedures, the robotic arm does the rest. Watch out, LeBron!
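The launch calculation itself reduces to textbook ballistics. As a rough illustration (my own sketch, not the team's actual code), given the hoop's position from stereo vision and a chosen launch angle, the required release speed follows from the drag-free trajectory equation:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def launch_speed(distance, height, angle_deg):
    """Release speed needed for a drag-free ball launched at angle_deg
    to pass through a target `distance` meters away and `height` meters
    above the release point. Returns None if that angle can't reach it."""
    theta = math.radians(angle_deg)
    # Trajectory: y(x) = x*tan(theta) - G*x^2 / (2*v^2*cos^2(theta))
    # Solve y(distance) = height for v.
    denom = 2 * math.cos(theta) ** 2 * (distance * math.tan(theta) - height)
    if denom <= 0:
        return None  # target lies above the line of fire at this angle
    return math.sqrt(G * distance ** 2 / denom)

# Hoop 3 m away, rim 0.5 m above the release point, lobbed at 60 degrees
v = launch_speed(3.0, 0.5, 60.0)
```

A real system would still need the calibration step the researchers mention, mapping commanded motor output to actual release velocity.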
UPDATE: I incorrectly attributed this work to researchers at National Chiao-Tung University. The robot was actually built at Taiwan's Industrial Technology Research Institute. National Chiao-Tung University collaborated in the project.
Image and video: Industrial Technology Research Institute
Computer model of Boston Dynamics' Cheetah robot galloping.
Boston Dynamics, best known for its BigDog bionic beast and other agile machines, is developing two new robots: one will be a super fast quadruped called Cheetah, which obviously should've been named BigCat; the other is a beautifully intricate, freakishly scary full-size humanoid called T-800 Atlas.
The Cheetah robot will have a flexible spine, an articulated head and neck, and possibly a tail. Think of BigDog, but rather than a robot mule, Cheetah will be a sprinter, able to accelerate rapidly and make tight turns so it can "chase or evade," the company said in a statement.
In fact, Boston Dynamics says Cheetah will sprint "faster than any existing legged robot and faster than the fastest human runners." That's a bold claim. But seeing what the company has demonstrated with BigDog, we're excited to see this cybernetic cat stepping out of their lab.
The second robot Boston Dynamics is building, the humanoid Atlas, will have a torso, two arms and two legs, and will be capable of climbing and maneuvering in rough terrain. The robot will "sometimes walk upright as a biped, sometimes turning sideways to squeeze through narrow passages," and sometimes crawl, using its hands for extra support and balance. (I don't suppose they'll run the Cheetah running algorithms on it, or will they?)
Atlas, a new humanoid robot that Boston Dynamics is developing, will rely on hardware built for another of the company's robots, Petman, shown here during initial assembly and testing.
Atlas will be based, in part, on Petman, an anthropomorphic robot Boston Dynamics developed for the U.S. Army. Until recently, only the robot's legs had been made public, but now the company has unveiled its full (well, headless) body [see photo above].
Atlas will be different from existing humanoids that use static techniques to control their movements, relying instead on a dynamic control approach, the company said. "Unlike Honda’s Asimo and most other humanoid robots you’ve seen, Atlas will walk like a man, using a heel-to-toe walking motion, long strides and dynamic transfer of weight on each step," said Rob Playter, the Atlas principal investigator and vice president of engineering at Boston Dynamics.
Another bold claim. So far we've seen that Boston Dynamics can make its Petman humanoid run -- and fast. But I still want to see it walking around in human spaces, negotiating obstacles on the floor, or keeping its balance when someone pokes it in the chest -- things that other humanoids, including, yes, Asimo, demonstrated a long time ago.
The company says both robots will stand out for their use of dynamic agility, throwing or swinging their legs and arms to maintain balance and overcome obstacles. "For these programs to succeed we must develop robot hardware and software with the speed, flexibility and strength of athletes, and a more fundamental understanding of how legs work," said Marc Raibert, lead investigator of the Cheetah program and president of Boston Dynamics.
Boston Dynamics, based in Waltham, Mass., will develop the two robots as part of new contracts that the U.S. Defense Advanced Research Projects Agency (DARPA) has awarded the company.
The company says that, in addition to military applications, Cheetah and Atlas could find uses in emergency response, firefighting, advanced agriculture, and vehicular travel in places that are inaccessible to conventional wheeled and tracked vehicles.
Marathon runners require long hours of training, plenty of water, and an iron will. In the world's first bipedal robot marathon, the key ingredients seemed to be line-tracking algorithms, batteries, and lots of compressed air coolant.
The 42.195-kilometer race (the length of a real marathon) took place in Osaka, and a little humanoid robot called Robovie-PC was the big champion. It crossed the finish line on Saturday after a grueling 54 hours, 57 minutes, and 50.26 seconds -- more than two days running non-stop on the track. Only 1.73 seconds later, another contestant, Robovie-PC Lite, completed the race. The robot naming isn't a coincidence: The two robots were submissions from Vstone, the robotics company that organized the event with the city of Osaka.
It was an exciting ending. Watch:
What makes a winning robot? Team Vstone used line-tracking to navigate the track, taking advantage of the rule allowing autonomous navigation. The other four teams, including two student teams from Osaka Institute of Technology, patiently controlled their bots using game controller-like remotes. A dedicated human presence was also necessary to support the "runners": When batteries ran low, teams rushed in to swap them for fresh ones. Periodically, teams also needed to spray overheating motors with cans of cool, compressed air. Falling, however, was not a problem -- all robots had to be designed with an automatic "getting up" feature.
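Line tracking of the sort Vstone used is one of the simplest autonomy wins available. As a rough illustration (entirely my own sketch, not Vstone's code), a proportional controller over an array of reflectance sensors is often all it takes:

```python
def line_offset(readings):
    """Weighted centroid of readings from a reflectance sensor bar
    (odd number of sensors assumed). Readings are higher over the dark
    line; positions are centered on zero, so the result is the line's
    lateral offset in sensor units."""
    positions = range(-(len(readings) // 2), len(readings) // 2 + 1)
    total = sum(readings)
    if total == 0:
        return 0.0  # line lost; a real robot would fall back to searching
    return sum(p * r for p, r in zip(positions, readings)) / total

def steering_command(readings, gain=0.5):
    """P-controller: steer proportionally against the measured offset."""
    return -gain * line_offset(readings)

# Line slightly to the right of center on a 5-sensor bar
cmd = steering_command([0, 0, 2, 8, 1])
```

A marathon-grade controller would add at least a derivative term to damp oscillation over 42 km, but the core idea is this small.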
At an average speed of 0.7 km/h, the robots were about as exciting as watching a tortoise cross the Sahara. However, these endurance races highlight the requirements for long-running, autonomous robots. Robots that don't have their own dedicated pit crews need autonomous navigation, automatic recharging, and low-maintenance actuators. The bipedal aspect was also important; stairs and raised sidewalks are constant reminders that our world is designed for two-legged humans.
The Japanese government is aiming for robots to take care of their nation's aging population. But if we want robots to take care of us, instead of the other way around, we'll first need to see a robot marathon where no human intervention is required. So what we saw here were the first steps -- literally.
Image: Vstone Co.
Angelica Lim is a graduate student at the Okuno and Ogata Speech Media Processing Group at Kyoto University, Japan.
Need to destroy something? Get an F16. No, not that F16. The F16 demolition robot from Stanley Hydraulic Tools. Unveiled this month, this electrically driven hydraulic monster comes with five different attachments: shear, breaker, grapple, drop hammer, and our favorite, a concrete-cracking claw. Sure, it's more of a remote-controlled shrunken excavator than a robot. But who cares? It can tear down walls and cut steel like butter. Can we bring this guy to RoboGames?
GPS is generally the standard to which all other localization technologies are compared, and in most outdoor environments, it's hard to beat for accuracy, precision, and reliability. It's funny, then, that the times you need GPS the most (in places like downtown New York or San Francisco) usually end up being the times that GPS utterly fails due to tall buildings blocking out the sky.
Robots have this same problem, so researchers have been trying to find other ways that a robot can localize itself where GPS is intermittent. A common strategy is to use wheel odometers or inertial measurement units to "guess" where the robot has gotten to since its last external position fix, but those estimates drift over time, so robots still rely on landmarks to correct the accumulated error. So let's see, what are some things that are easy to find in urban environments and don't go anywhere? You probably weren't thinking "manhole covers," but yeah, it's manhole covers.
The reason manhole covers might be a good idea is that they have well-defined locations, and because they're fairly large and made of metal, they're pretty easy for a robot to spot, even if it's dark or rainy or snowy or whatever. With sensitive enough sensors, a robot could detect all the dents and wear that make each cover unique and fix its position to within inches. It sounds great, and it works in tests when paired with a database of pre-scanned manhole covers. The catch is that you generally find manhole covers in the middle of the street, which means a system like this would be best suited to autonomous cars, as opposed to other robots that might not do as well running into traffic to determine their position.
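The underlying idea -- dead reckoning corrected by occasional landmark fixes -- can be sketched in a few lines. This is my own toy illustration, assuming a hypothetical pre-surveyed map of cover locations:

```python
# Hypothetical pre-surveyed map: cover ID -> known (x, y) position, meters
MANHOLE_MAP = {"cover_17": (12.0, 48.0), "cover_18": (95.0, 48.0)}

class DeadReckoner:
    """Integrates noisy odometry; snaps to a landmark when one is matched."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def odometry_step(self, dx, dy):
        # Each step accumulates whatever error the wheel encoders carry.
        self.x += dx
        self.y += dy

    def landmark_fix(self, cover_id):
        # Matching a cover's unique wear pattern against the database
        # pins the robot to its surveyed location, wiping out the drift.
        self.x, self.y = MANHOLE_MAP[cover_id]

robot = DeadReckoner()
for _ in range(100):
    robot.odometry_step(0.121, 0.482)  # odometry with a built-in bias
robot.landmark_fix("cover_17")  # drive over a recognized cover
```

A real system would fuse the landmark fix with the odometry estimate in a Kalman filter rather than simply overwriting it, but the principle is the same: the cover's known position bounds how far the error can grow.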
And, yes, that's a picture of a manhole robot. That shoots manholes. Thank you, Mighty Morphin Power Rangers.
PR2 robot with Automaton t-shirt. Robots love Automaton. Automaton loves robots.
Good news, everyone! We're thrilled to report that Automaton is a finalist in this year's National Magazine Awards for Digital Media, aka the Digital Ellies. The Ellies are the magazine industry's answer to the Oscars -- but with more scruffy people with unstylish hair.
Automaton is in good company, with Salon, Sports Illustrated, Sunset, and Tablet as the other finalists in the blogging category. This means that robotics as a subject is competing with politics, sports, and food, which is what the other blogs cover. Thankfully, SI doesn't have a blog on swimsuit models!
Last year, our Robots for Real show was a finalist in the podcast category, so this marks the second year in a row that IEEE Spectrum has been recognized with a Digital Ellie nomination. The 2011 winners will be announced at the Digital Ellies ceremony in New York City on March 16. Fingers, human and robotic, crossed!
I just want to say thanks to you, our readers, and to all roboticists and their robots for keeping us inspired about the possibilities of science and engineering.
Need someone to zip up your dress? In 1964, the Hughes Mobot was there to help.
Hughes Aircraft's Mobot, aka Mobot the Magnificent Monster (seriously), was originally designed for the Atomic Energy Division of the Phillips Petroleum Company in the late 1950s and early 1960s as a remote manipulator.
A 150-meter [500-foot] cable led back to a control console, where a human operator could safely direct remote cleanup operations of radioactive material and other nasty stuff. Mobot had two manipulator arms along with two cameras, which are the things that look vaguely like water-cooled machine guns but aren't. Sad.
Apparently, this degree of usefulness wasn't good enough for Life magazine, which decided that Mobot (and its delicate touch) would be better off helping women put on makeup and get dressed. Or is it undressed? Feel free to use your imagination on that one.
Tomorrow is a huge day for robotkind. If all goes as planned, at 4:50 p.m. EST, the space shuttle Discovery will blast off from Cape Canaveral, Florida, carrying aboard a crew of astronauts and also NASA's Robonaut 2, which will become the first humanoid robot in space.
The shuttle's destination is the International Space Station (ISS), where Robonaut 2 will become a permanent resident and work alongside humans as a robotic helper. Astronauts will mount the robot on a fixed pedestal inside one of the ISS labs and use it to perform tasks like flipping switches and holding tools.
So no, Robonaut won't be fixing meals for the human crew. The main goal is to find out how manipulation robots behave in space -- and also give crew members a second pair of hands. NASA hopes the experience will allow it to upgrade the robot in the future, so it would be able to support astronauts in more complex tasks, including repairs and scientific missions outside the ISS.
The robot can perform tasks autonomously or under remote control, or a mix of both, Nic Radford, the Robonaut deputy project manager, told us. Astronauts on the station will operate the robot using a laptop, he said, though it can also be "joysticked" and directly controlled from Earth, with a few seconds of delay.
Sending Robonaut to space is a great feat for NASA, but it raises the question: Is this another step in using robots to replace humans in space exploration? In my opinion, using teleoperated and semi-autonomous robots makes a lot of sense. Robotic explorers have already demonstrated that unmanned missions offer formidable rewards, with immensely smaller costs and risks than manned ones. Of course, NASA enjoys cheering for its robots, but it's quick to point out that robots are not a replacement for humans in space, but rather "companions that can carry out key supporting roles."
That might be the case for now, as robots still can't match human manipulation and other capabilities. But robots are catching up fast. One of Robonaut 2's key features is its dexterous, humanlike arms and hands. Each arm is about 80 cm [31 in] long and can hold 9 kg [20 lb] in Earth's gravity. Each hand has 12 degrees of freedom: 4 DOFs in the thumb, 3 DOFs in both the index and middle fingers, and 1 DOF in the other fingers. The fingers are articulated and driven by tendons, just like human hands, and Robonaut is able to use the same tools that human astronauts use.
At the IEEE Humanoids conference last December, I spoke with GM researcher Muhammad E. Abdallah, who explained how Robonaut's hands work:
Robonaut's hands work a bit differently than similar humanlike robot hands. Existing tendon-driven robotic fingers typically control their joints using tension controllers on each tendon; desired joint torques are translated into desired tendon tensions. The problem with this approach is that coupling between the tendons and the joint displacements introduces disturbances into the motion of the fingers. NASA and GM engineers solved the problem by implementing a joint-based torque control method that decouples the tendon effects and is faster and more reliable than traditional methods.
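To make the coupling concrete: in a tendon-driven finger, joint torques relate to tendon tensions through a routing (moment-arm) matrix, and having one more tendon than joints lets the controller keep every tension positive. Here's a minimal numpy sketch of that tension-distribution step -- my own illustration of the general technique, not NASA/GM's actual controller:

```python
import numpy as np

def tendon_tensions(R, tau, f_min=1.0):
    """Tensions f (all >= f_min) achieving joint torques tau = R @ f.
    R is the (n_joints x n_tendons) moment-arm matrix; with n+1 tendons
    driving n joints, its one-dimensional nullspace is the 'co-contraction'
    direction, which for antagonistic routing has all-positive entries."""
    f = np.linalg.pinv(R) @ tau          # minimum-norm particular solution
    null = np.linalg.svd(R)[2][-1]       # nullspace direction of R
    if null.sum() < 0:
        null = -null                     # orient toward increasing tension
    # Add just enough co-contraction to bring every tendon up to f_min;
    # this changes internal tension without changing the joint torques.
    lam = max(0.0, np.max((f_min - f) / null))
    return f + lam * null

# Toy 2-joint, 3-tendon finger
R = np.array([[0.5, -0.5, 0.2],
              [0.3,  0.3, -0.6]])
tau = np.array([0.4, -0.1])
f = tendon_tensions(R, tau)
```

The joint-based scheme the engineers describe closes the torque loop in joint space and then distributes tensions like this, rather than running an independent tension loop per tendon.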
The ability to control torque is important for Robonaut, and other humanoid robots, for that matter, because its hands will interact with unexpected objects or items slightly out of position. Industrial robots, by contrast, interact with known objects in well-defined spaces. Robonaut's hands mimic human hands in their ability to adapt to variation -- a capability that NASA demonstrated by having different people shake hands with the robot.
But the robot is more than just arms and hands, of course. Robonaut 2 weighs in at 150 kg [330 lb] and, if you're wondering, it has no legs -- it will remain stationary inside the ISS, although NASA researchers have been experimenting with robotic legs and wheels. Built primarily of aluminum with steel parts, it carries over 350 sensors and has a total of 42 degrees of freedom.
Behind its helmet visor are four visible light cameras: two provide stereo vision for the robot and remote operators, and two work as auxiliary cameras. A fifth, infrared camera is housed in the mouth area for depth perception. Because the head is full of cameras, the robot's computer system -- 38 PowerPC processors -- is housed inside the torso. Or as NASA puts it, Robonaut 2 "thinks with its stomach -- literally." See this cool infographic that SPACE.com prepared:
In a second phase of the Robonaut project, at an undecided date, NASA will make the unit mobile using a leg-type system, giving it the ability to move around inside the ISS. The third phase will feature a robot that will perform missions outside the space station. Robonaut is also part of Project M, which aims to put a humanoid robot on the moon in 1,000 days -- beating Japan's proposed goal of 2015.
For now, all eyes will be locked on the space shuttle at Cape Canaveral. It's been a long wait for this launch. And once Robonaut arrives at the ISS, it might take several months until astronauts unpack it and bring it to life. Still, I find the idea of a robot in space -- a staple of science fiction -- truly exciting. What do you think? Is this the beginning of a new era in robotic space exploration?
PS: Watch the "movie trailer" NASA prepared about the "new recruit."
Images: NASA; videos: IEEE Spectrum and NASA; infographic: SPACE.com