You know what's special about today's Video Friday? No drones. No UAVs. No flying robots at all. Some weeks, it's all flying robots. Most weeks, there are at least a few. But not this week. Zero. This probably means that I've somehow missed out on something cool (and there is one thing, but I'm saving it for a proper article next week). So if you only bother to show up to Video Fridays for the fliers, you can take the week off. For the rest of us, let's get to it.
MIT's Personal Robots Group has a trailer for "DRAGONS." Looks like Kombusto has been fruitful and multiplied. But though the video says "Summer 2015," we're not sure if they are serious—with Cynthia Breazeal on leave to work on Jibo (see next video), we wonder if DRAGONS is still an active project.
[ MIT's Personal Robots Group ]
I was hoping that Jibo, the personal robot that MIT's Cynthia Breazeal unveiled last month, might have a little more in common with Kombusto (like claws), but that doesn't seem to be stopping it, as the Indiegogo campaign is at $1.5 million with eight days to go. Here's an update for developers:
[ Jibo Blog ]
Another video from MIT? I guess we might as well just get them all in here in one big chunk. Here's MIT's Atlas demonstrating compliant balancing, which will come in handy when it competes in the DRC Finals next year.
[ MIT DRC ]
Meanwhile, in Atlanta, Georgia Tech's GRITS Lab is working on swarm robotics:
[ GRITS Lab ]
Two years ago, Curiosity landed on Mars. Happy anniversary! To celebrate, JPL is making sure that the rover's path to Mount Sharp is as squishy as possible:
[ Curiosity ]
DLR's MiroSurge robotic surgery platform might be a little bit old (I can't quite tell; the last paper on it may have been published in 2010), but this video is new to their YouTube channel:
[ DLR ]
Also from DLR is something that's newer and more exciting (at least to me): a video showing how a "robot can bias its feathers in the forearm comparable far flick a ball like a man." Or at least, that's what Google Translate thinks the description says. I think it's a demonstration of how their robotic musculoskeletal hand and arm can flick a ball by "loading" a finger against a thumb:
[ DLR ]
Personally, I prefer my robot vacuums to be entirely autonomous. I like being able to leave and come home and then the floor is just magically cleaner. But if you prefer a more active role, Samsung's new robot vacuum lets you pilot it around by hand with a special laser pointer:
This thing costs like US $1,300, and I have to figure that a solid 85 percent of the engineering cost was probably figuring out how to home in on the laser pointer.
Via [ Gizmodo ]
With the help of a tablet and Angry Birds, children can now do something typically reserved for engineers and computer scientists: program a robot to learn new skills. The Georgia Tech project is designed to serve as a rehabilitation tool and to help kids with disabilities.
From the press release:
In a new study, [Georgia Tech's Ayanna Howard and Harvard postdoc Hae Won Park] asked grade-school children to play Angry Birds with an adult watching nearby. Afterwards, the kids were asked to teach a robot how to play the game. The children spent an average of nine minutes with the game as the adult watched. They played nearly three times as long (26.5 minutes) with the robot. They also interacted considerably more with the robot than the person. Only 7 percent of their session with the adult included eye contact, gestures and talking. It was nearly 40 percent with the robot.
[ Georgia Tech ]
Yujin Robot's Kobuki is best known as the base of the TurtleBot 2, but it's a capable (and cheap!) platform on its own:
[ Kobuki ]
Remember "Box," that incredible short film created with the robotic cameras from Bot & Dolly (one of the robot companies acquired by Google last year)?
Here's some behind the scenes of how they made it happen:
[ Bot & Dolly ]
These robots can see through walls using... WiFi??!
Imagine unmanned vehicles arriving behind thick concrete walls. They have no prior knowledge of the area behind these walls. But they are able to see every square inch of the invisible area through the walls, fully discovering what is on the other side with high accuracy. The objects on the other side do not even have to move to be detected. Now, imagine robots doing all these with only WiFi signals and no other sensors. In Mostofi's lab at UCSB, we have shown how to do this.
[ UCSB ]
AUVSI's RoboSub competition ended last weekend. Here's a summary of the competition, and the final results, which ought to be enough robot videos to get you through the rest of your Friday:
[ RoboSub 2014 ]
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.