Robot Videos: Robot With Bird Legs, DARPA Manipulation Demo, and Darwin-OP's Dance Moves

In this edition of our video-filled Friday post, we present a robot with bird-inspired legs, a demo of DARPA's ARM program, and a Darwin-OP humanoid learning to play Dance Dance Revolution.


Evan Ackerman is IEEE Spectrum’s robotics editor.


See this? It's a robot with bird legs. Genius. This, plus a manipulation arm from DARPA and a Darwin-OP humanoid learning to play Dance Dance Revolution, in this edition of our more or less reliable (it's less) video-filled Friday post.

First up, and for the life of me I don't know how we missed this thing at the IEEE International Conference on Intelligent Robots and Systems back in September, is this quadrotor with giant spindly bird legs from the Utah Telerobotics Lab. It's a passive system that uses the weight of the robot to actuate the grippers, so that when the quadrotor lands on something, it latches on and stays. To release, the robot just lifts off, and the grippers let go. Birds use a very similar system to keep themselves perched while asleep, and the leg and toe design was inspired directly by songbirds.

Incidentally, the quad does fly with these things hanging off of it: you can see a video of it in the air (but not perching) here.

[ Utah Telerobotics Lab ]

Last July, Darwin-OP had just started learning how to use a Dance Dance Revolution-style pad, and we're happy to report that Paul Fredrickson at Purdue has managed to teach the robot to visually recognize symbols from the game and respond to them with the appropriate pad actuation. We've been promised more detail in a follow-up vid in the near future, and we're keeping our fingers crossed for an operational version that can destroy humans at a real round of DDR some time soon.

[ Purdue ]

And finally we've got this video from DARPA's ARM program. ARM stands for Autonomous Robotic Manipulation, and the program has been running for several years now using relatively inexpensive off-the-shelf hardware to perform human-ish tasks like opening doors and using tools. In open-source tests from November of last year, the best teams achieved a 93 percent success rate in grasping both modeled and "unmodeled" objects.
