Automaton

Little Amphibious Tumbling Robot Tackles Tough Terrain

As components get smaller, robots are getting smaller as well, but in general small robots have big problems with obstacles and rough terrain. We've seen a variety of examples of robots that have found ways around this problem (most notably robots that jump or fly), but this might be the most creative yet: it's a robot that tumbles.

By "tumble," I mean that this robot is designed to move by flipping itself end-over-end in a somersaulting motion. It's called Aquapod, and it was created by the University of Minnesota's Center for Distributed Robotics. Aquapod uses two carbon fiber arms connected to servo motors that can rotate continuously to, as the researchers put it, "induce a tumble."

The reason that it's called Aquapod, incidentally, is that it's also waterproof, with the ability to control its buoyancy: floating, sinking, or just chilling out somewhere in the water column. This lets it operate quite happily on land as well as in water, where it can sink to the bottom of lakes and streams and tumble along the bed.

Aquapod might not be the fastest robot ever, but it has no trouble tumbling over slippery surfaces, through sand, and towards skeptical ducks. The offset arms help to give it more degrees of freedom to escape from vegetation and other obstacles, trading a little bit of efficiency for increased robustness.

The general intent is for Aquapod to be used in water monitoring or aquatic sensor deployment, where groups of them can team up to float down rivers, sinking, floating, deploying sensors, and taking measurements as they go. It would even be possible to stick one underneath an iced-over lake to monitor fish populations during the winter, where the robot could move around by "inverse tumbling" along the underside of the ice.

Next up is adding solar power along with autonomous control for long-duration research. Even without any of that, though, the robot is still a very promising platform, since it's estimated to cost only about $2,000 to build.

Aquapod was presented in an ICRA paper entitled "Aquapod: Prototype Design of an Amphibious Tumbling Robot," by Andrew Carlson and Nikos Papanikolopoulos from the Center for Distributed Robotics at the University of Minnesota.

Robot Uses Supersonic Jets of Air to Stick to Almost Anything

There are all kinds of ways to stick to a surface, but one of the simplest is to use a gripper that operates on the Bernoulli principle. All the Bernoulli principle says is that as a fluid moves faster, its pressure decreases. For the purposes of a robotic gripper, air counts as a fluid, and if you squirt air out around the edges of a circular gripper fast enough, it'll generate a vacuum force strong enough to grab things without the surface of the gripper actually touching them. The major upside of this technique is that you get a non-contact vacuum grip, which is useful for grabbing stuff that's sterile or fragile.
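To put rough numbers on this, here's a toy calculation using the incompressible form of Bernoulli's equation (p + ½ρv² is constant along a streamline). The speeds, pad size, and resulting force below are our own illustrative guesses, not figures from any real gripper, and this simple incompressible form stops being valid at the supersonic speeds discussed further down, where compressible-flow analysis is required.

```python
import math

# Incompressible Bernoulli: p + 0.5 * rho * v**2 is constant along a
# streamline, so speeding the flow up lowers its static pressure.
# All numbers below are illustrative, not from a real gripper datasheet.

RHO_AIR = 1.2  # kg/m^3, approximate air density at sea level

def bernoulli_pressure_drop(v_slow: float, v_fast: float) -> float:
    """Static pressure drop (Pa) when flow accelerates from v_slow to v_fast."""
    return 0.5 * RHO_AIR * (v_fast**2 - v_slow**2)

def suction_force(delta_p: float, area_m2: float) -> float:
    """Net holding force (N) if the pressure drop acts over the pad face."""
    return delta_p * area_m2

# Air accelerated from rest to 100 m/s under a 40 mm diameter pad:
dp = bernoulli_pressure_drop(0.0, 100.0)   # 6000 Pa below ambient
pad_area = math.pi * 0.020**2              # pad radius 20 mm
force = suction_force(dp, pad_area)        # roughly 7.5 N of grip
```

A few newtons over a pad this size is consistent with Bernoulli grippers being fine for pick-and-place but marginal for supporting an entire robot's weight.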

Image: Bosch Rexroth AG

While Bernoulli grippers are fine for picking up things, they're not generally strong enough to enable a robot to support its own weight, much less climb. A research group from the University of Canterbury in New Zealand has developed a supersonic version of the Bernoulli gripper that's five times stronger than the conventional version, which is enough to allow a robot to climb on a bunch of different surfaces. And when you're watching this video, keep in mind that unlike pretty much every other climbing robot in existence, the grippers on this robot aren't touching the wall:

The geometry of the new type of air gripper (or non-contact adhesive pad, abbreviated NCAP) that this robot uses has been carefully designed around a tiny gap (a mere 25 μm) that forces the airflow to go supersonic (Mach 3, to be precise). This doesn't require an increase in airflow or pressure; it's all done by the geometry of the gripper itself compressing the airflow and speeding it up:

This airflow creates a low pressure vortex inside the gripper which provides the actual adhesion force, and in testing on the robot, this supersonic gripper is able to support five times as much weight as a conventional Bernoulli gripper, all without using any additional air volume or pressure.

As for the robot itself, it may be used for industrial inspections. The supersonic non-contact grippers will be available in "some months" for "a few hundred dollars," the researchers say.

The robot was presented in an ICRA paper entitled "An Investigation into Improved Non-Contact Adhesion Mechanism Suitable for Wall Climbing Robotic Applications," authored by Matthew Journee, XiaoQi Chen, James Robertson, Mark Jermy, and Mathieu Sellier.

ICRA 2011 Expo Gallery

While most of ICRA was devoted to research presentations, there was a lively expo floor stuffed with robots that would be from all corners of the globe, if a globe had any corners. We're nowhere near finished with our coverage of the research, but for today, enjoy this gallery of pics from the expo:

Guarding the entrance to the expo hall were Yaskawa's lightsaber-dueling robots, which as far as I can tell were battling each other non-stop for five days straight. Obviously, they'd had lessons.


Adept is now selling the AQUA2 research robot, which is a commercial derivative of the RHex platform designed for underwater operations. 


Also from Adept was this intelligent robot kiosk, which incorporates a touchscreen display on a mobile base with autonomous navigation.


The Shadow C6M Smart Motor Hand is touted as the most advanced dextrous robot hand in the world, boasting 24 movements which allow for direct mapping from human hand motions onto the robot hand.


Willow Garage managed to ship a PR2 all the way to Shanghai in a crate, where it immediately (well, almost immediately) got busy picking up squishy turtle toys and putting them down again. 


The reason for PR2's turtle toys was all the publicity surrounding the recent debut of TurtleBot, which took part in an impromptu educational robot parade along with Kuka's youBot and Aldebaran Robotics' Nao.


DARwIn-OP kept itself busy all week kicking a red ball around and forcing people to dodge out of the way of its exceptionally determined object-tracking algorithms.


Looking for a robot to do something dirty or dangerous? Shanghai's own xPartner Robotics Company has you covered.


TurtleBot may not be the fastest robot on the block, but its spacious upper deck is good for riding around on, as Nao discovered.



This is actually one of the very first production versions of the Kuka youBot. Pretty sexy, even in orange. 


Stay tuned for more ICRA coverage

PR2 Learns to Read, Can't Pronounce 'Robot' (UPDATE: Yes I Can, Says PR2)

UPDATE: Menglong Zhu, the UPenn researcher who taught their PR2 robot to read, contacted us to say that the robot, named Graspy, took issue with our headline. Graspy claims it can pronounce "robot" and sent us audio to prove it:

Teaching a robot to read out in the wild is no easy task, thanks in large part to the propensity of graphic designers (along with us normal people) to use a bewildering number of different fonts and colors to better communicate creative vision, mood, or just general boredom with Helvetica.

The University of Pennsylvania's GRASP Lab has conquered these factors, along with such things as variable lighting and distance, and has gotten their PR2 (named "Graspy") to wander around, reading things non-stop in a monotone and perhaps slightly confused voice. This newfound literacy will be available for download for both PR2s and generalized ROS platforms, which means that you can give your robot a huge brain upgrade and vastly increase its interactive capabilities with just a few simple clicks.

Two questions remain unanswered, though: can it read Wingdings, and will it, on principle, read something written in Comic Sans?

[ GRASP Lab ]

READ ALSO:

PR2 Robot Can Scan And Bag Your Groceries
Wed, May 11, 2011

Blog Post: Stanford University has their PR2 picking up items in a checkout line, scanning them, and putting them in a bag for you

How Robots Can Learn From Your Pathetic Failures
Wed, May 18, 2011

Blog Post: Getting a robot to do what you want is never an easy task, especially if you can't even do the task yourself

Lingodroid Robots Invent Their Own Spoken Language
Tue, May 17, 2011

Blog Post: These little robots make up their own words to tell each other where they are and where they want to go

Awesomely Bad Ideas: Teaching a Robot to Sword Fight
Fri, May 13, 2011

Blog Post: Georgia Tech has given a robot a sword and told it that humans are out to get it, all in the name of safety

Robot Film Festival in NYC This July

Robots seem to inspire people to make awesome movies and videos of all kinds, which is why it's high time that someone went out and put together a Robot Film Festival. If you're in New York City on July 16 and 17, you should definitely go, or better yet, you should submit your own video and be a part of the show, which will apparently include a red carpet, an awards ceremony, and cocktails.

Here's a little teaser of what you might expect to see at the Robot Film Festival. I've seen a lot (like, seriously, a lot) of robot videos, and this is easily one of the weirdest:

That's a real robot! And it hates alien eggs! Impressive, yeah?

The submission deadline for the festival is June 5, and everything else you need to know is at the link below.

[ Robot Film Festival ]

Nao Robots Dance the Macarena Better Than You

The Nao, the little French humanoid whose software just became open source, is always learning new tricks. We've seen it showing off Michael Jackson moves, doing Star Wars impressions, and performing an 8-minute synchronized dance routine. Now a trio of Nao robots is busting out some Latin dance moves with a Macarena performance that leaves the uncoordinated among us more humiliated than ever: now even machines dance better than we do.

The routine was created as part of a computer science course taught by Rudolf Jaksa and Maria Vircikova from the Center for Intelligent Technologies at the Technical University of Kosice, in Slovakia. Their students programmed Nao robots to perform a variety of dances (if you have a Nao, you can download the Choregraphe source files here). One of the students, Boris Raus, from Croatia, created the Macarena routine. "Programming the robots to dance," Vircikova says, "is an entertaining way for students to learn and implement algorithms that explore aesthetic motion, human-robot interaction, and creativity."

High-Speed Robot Hands Fold a Towel in 0.4 Second

Remember those crazy fast robotic hands that can dribble a ball in the blink of an eye? A research group from the University of Tokyo has been teaching them to fold towels (very small towels) at blistering speed, poking some fun at Berkeley's PR2 and its rather more, um, sedate pace.

What the researchers figured out was that if you move something deformable (like a piece of cloth) fast enough, it'll simply follow the motion path of whatever it's attached to, and you don't have to worry about niggling little annoyances like the effects of gravity. Using this method, it's possible to calculate the path that the cloth will take, enabling a robot to fold it super fast, as in the video above.
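A quick back-of-the-envelope check (our own arithmetic, not the paper's model) shows why speed makes gravity ignorable: the sag a cloth picks up in free fall grows with the square of the elapsed time, so the faster each motion segment, the more closely the cloth tracks the commanded path.

```python
# Free-fall sag of an unsupported cloth edge after t seconds. Because the
# displacement grows as t**2, cutting the motion time by 10x cuts the
# gravity-induced deviation by 100x. Illustrative arithmetic only.

G = 9.81  # m/s^2, gravitational acceleration

def free_fall_sag(t_seconds: float) -> float:
    """Distance (m) a free mass drops under gravity in t seconds."""
    return 0.5 * G * t_seconds**2

slow = free_fall_sag(0.4)    # over the full 0.4 s fold: ~0.78 m
fast = free_fall_sag(0.04)   # over a 40 ms motion segment: ~8 mm
```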

These high speed hands were able to fold a cloth in half in an average of 0.4 second with a success rate of about 80 percent, but researchers hope to improve that with the addition of an improved visual feedback system (similar to the one they use to scan a book just by flipping its pages) that will be able to tell the hands exactly when to close. Eventually, the hope is to teach the hands to fold a more versatile range of objects, along with crazier things like high-speed origami.

This research was presented by Yuji Yamakawa, Akio Namiki, and Masatoshi Ishikawa of the University of Tokyo and Chiba University, in their ICRA paper entitled "Motion Planning for Dynamic Folding of a Cloth with Two High-speed Robot Hands and Two High-speed Sliders."

[ Ishikawa Oku Lab ]


Treebot Learns to Autonomously Climb Trees

This is Treebot. As you might expect, Treebot was designed to do one thing: climb trees. It is by no means the first robot able to do this, but its arboreal predecessors (RiSE and Modsnake and accidentally PackBot are just a few) weren't autonomous and didn't have the skills necessary to negotiate the complex network of branches that you tend to find on trees worth climbing.

The design of Treebot is unusual: it uses a set of flexible linear actuators connecting two gripping claws to allow it to move around like an inchworm. While the back gripper holds on, the front gripper releases and the body extends forward, allowing the robot to literally feel around for a good place to grip.

Keeping to the inchworm theme, the robot doesn't use much in the way of fancy sensors. Instead, it's all tactile. You can tell the robot which direction you'd like it to go and how far, and the robot will grope its way to its destination, adaptively navigating from trunk to branches.
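That grip-extend-grope-grip cycle can be sketched as a simple sequence of phases. This is our own illustration with a made-up interface, not the actual Treebot controller.

```python
from dataclasses import dataclass

# A toy model of the inchworm gait: the back gripper holds while the front
# releases, extends, and gropes for a new hold; then the body contracts and
# the back gripper re-grips. Hypothetical API, for illustration only.

@dataclass
class Gripper:
    name: str
    holding: bool = True

def inchworm_step(front: Gripper, back: Gripper) -> list:
    """Run one grip-extend-grip cycle and return the phase log."""
    phases = []
    front.holding = False
    phases.append("front released")
    phases.append("body extended, front groping for a hold")
    front.holding = True
    phases.append("front gripping")
    back.holding = False
    phases.append("back released, body contracted")
    back.holding = True
    phases.append("back gripping")
    return phases

front, back = Gripper("front"), Gripper("back")
log = inchworm_step(front, back)  # five phases; both grippers end holding
```

The key invariant is that at least one gripper is holding at every phase, which is what lets the robot climb with no vision at all.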

At the moment, Treebot is more or less blind. This isn't necessarily a problem, but it could get where it wants to go much faster if it's able to tell which branches have the highest potential to allow it to efficiently climb higher up, so researchers are working on ways to help Treebot optimize its climbing path.

Treebot was designed by Tin Lun Lam and Yangsheng Xu from The Chinese University of Hong Kong, and their research was presented at ICRA last week in a paper entitled "Treebot: Autonomous Tree Climbing by Tactile Sensing."

How Robots Can Learn From Your Pathetic Failures

Robots that can learn from demonstrations are capable of watching a human do something, and then copying (or even improving on) the motions that the human makes in order to learn new tasks. This is fine if you're good at the task that you're trying to teach the robot, but if you're bad at it, you and your robot student are going to run into some problems.

Daniel H. Grollman and Aude Billard from the Learning Algorithms and Systems Laboratory at EPFL, in Lausanne, Switzerland, are working on ways for robots to learn from demonstrations, even if those demonstrations are failures. In the following video, a human shows a robot how to prop up a block and toss a ball into a basket without actually succeeding at either task:

The researchers developed learning algorithms that allow the robot to analyze your behavior and mathematically determine what parts of the task you're getting right (or you think you're getting right) and where you're screwing up, and eventually, it teaches itself to perform the task better than you. At the moment, the robot isn't using an adaptive learning approach; it's just trying different things until it accomplishes the objective. But part of the appeal of this system is that it uses failed human examples to help it know the extent of what it should try. I can almost hear a robotic voice saying, "Human, it's okay to fail."
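As a toy illustration of that idea (our own sketch, far simpler than the EPFL algorithm), one could treat the failed demonstrations as marking a promising region of parameter space to explore around, rather than a trajectory to copy exactly:

```python
import random

def learn_from_failures(failed_demos, try_task, spread=1.0,
                        max_trials=500, seed=0):
    """Sample task parameters near the failed demonstrations until
    try_task succeeds; return the winning parameter or None."""
    rng = random.Random(seed)
    # The failed demos tell us roughly where to look, even though
    # none of them worked on their own.
    mean = sum(failed_demos) / len(failed_demos)
    for _ in range(max_trials):
        candidate = rng.gauss(mean, spread)
        if try_task(candidate):
            return candidate
    return None

# Hypothetical task: a throw succeeds if its parameter lands in [4.5, 5.5].
# The human's failed demos all fell just short of that window.
result = learn_from_failures([3.8, 4.1, 4.0],
                             lambda x: 4.5 <= x <= 5.5)
```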

Grollman and Billard describe their work in a paper, "Donut As I Do: Learning From Failed Demonstrations," presented last week at the IEEE International Conference on Robotics and Automation (ICRA), in Shanghai, and they were honored with the Best Cognitive Robotics Paper award. Congrats!

[ Post updated to correct for the fact that the robot can't yet infer what your overall goal is... But they're working on it! ]

Lingodroid Robots Invent Their Own Spoken Language


When robots talk to each other, they're not generally using language as we think of it, with words to communicate both concrete and abstract concepts. Now Australian researchers are teaching a pair of robots to communicate linguistically like humans by inventing new spoken words, a lexicon that the roboticists can teach to other robots to generate an entirely new language.

Ruth Schulz and her colleagues at the University of Queensland and Queensland University of Technology call their robots the Lingodroids. The robots consist of a mobile platform equipped with a camera, laser range finder, and sonar for mapping and obstacle avoidance. The robots also carry a microphone and speakers for audible communication between them.

To understand the concept behind the project, consider a simplified case of how language might have developed. Let's say that all of a sudden you wake up somewhere with your memory completely wiped, not knowing English, Klingon, or any other language. And then you meet some other person who's in the exact same situation as you. What do you do?

What might very well end up happening is that you invent some random word to describe where you are right now, and then point at the ground and tell the word to the other person, establishing a connection between this new word and a place. And this is exactly what the Lingodroids do. If one of the robots finds itself in an unfamiliar area, it'll make up a word to describe it, choosing a random combination from a set of syllables. It then communicates that word to other robots that it meets, thereby defining the name of a place.
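The coining step itself is simple enough to sketch. The syllable inventory below is our invention (though the article's examples like "kuzo" suggest a similar flavor), and the real Lingodroids layer negotiation games on top of this to converge on a shared lexicon:

```python
import random

# Our illustrative syllable set; the Lingodroids' actual inventory may differ.
SYLLABLES = ["ku", "zo", "vu", "pe", "hi", "za", "re", "ya"]

def coin_word(rng: random.Random, n_syllables: int = 2) -> str:
    """Invent a new word from randomly chosen syllables."""
    return "".join(rng.choice(SYLLABLES) for _ in range(n_syllables))

def name_place(lexicon: dict, place_id, rng: random.Random) -> str:
    """Return the agreed name for a place, coining one if it's unfamiliar."""
    if place_id not in lexicon:
        lexicon[place_id] = coin_word(rng)
    return lexicon[place_id]

rng = random.Random(42)
lexicon = {}
word = name_place(lexicon, (3, 7), rng)   # first visit: a new word is coined
same = name_place(lexicon, (3, 7), rng)   # second visit: same word returned
```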


From this fundamental base, the robots can play games with each other to reinforce the language. For example, one robot might tell the other robot “kuzo,” and then both robots will race to where they think “kuzo” is. When they meet at or close to the same place, that reinforces the connection between a word and a location. And from “kuzo,” one robot can ask the other about the place they just came from, resulting in words for more abstract concepts like direction and distance:

This image shows what words the robots agreed on for direction and distance concepts. For example, “vupe hiza” would mean a medium long distance to the east.

After playing several hundred games to develop their language, the robots agreed on directions within 10 degrees and distances within 0.375 meters. And using just their invented language, the robots created spatial maps (including areas that they were unable to explore) that agree remarkably well:


In the future, researchers hope to enable the Lingodroids to "talk" about even more elaborate concepts, like descriptions of how to get to a place or the accessibility of places on the map. Ultimately, techniques like this may help robots to communicate with each other more effectively, and may even enable novel ways for robots to talk to humans.

Schulz and her colleagues -- Arren Glover, Michael J. Milford, Gordon Wyeth, and Janet Wiles -- describe their work in a paper, "Lingodroids: Studies in Spatial Cognition and Language," presented last week at the IEEE International Conference on Robotics and Automation (ICRA), in Shanghai.



Automaton

IEEE Spectrum's award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York, N.Y.
Senior Writer
Evan Ackerman
Berkeley, Calif.
 
Contributor
Jason Falconer
Canada
Contributor
Angelica Lim
Tokyo, Japan
 
