Video Friday: Robot Rocket, Giant Sphero, and 3D Printed Head

And more from Festo. And iRobot. And JPL. Get ready for the weekend with Video Friday!


Next week is U.S. National Robotics Week, of course, so really, we're just trying to blow through Friday as fast as possible and kick off a robotics-filled weekend. Sure, that describes pretty much every single Friday we've ever had, but watching a bunch of robot videos is the best way we know of to make that happen.

Let's start with . . . ROCKETS!

The testing at the Mojave Air and Space Port in California represented the first terrestrial demonstration of an autonomously guided rocket flying a planetary landing trajectory. The testing utilized Draper's GENIE (Guidance Embedded Navigator Integration Environment) system actively controlling a Masten Xombie terrestrial test rocket. GENIE is an Autonomous Guidance, Navigation, and Control (AGNC) avionics system that is the only system available today capable of precision planetary landings with real-time, autonomous trajectory planning and hazard avoidance maneuvers.
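
Draper doesn't say exactly what's running inside GENIE, but to get a feel for what "real-time, autonomous trajectory planning" means for a powered landing, here's a minimal sketch of a generic ZEM/ZEV (zero-effort-miss/zero-effort-velocity) guidance law. It's a standard textbook approach with made-up numbers, not GENIE's actual code:

```python
import numpy as np

# Generic ZEM/ZEV powered-descent guidance sketch. This is a textbook feedback
# law in the same family as precision-landing guidance, NOT Draper's GENIE
# algorithms; all vehicle numbers below are invented for illustration.

g = np.array([0.0, 0.0, -9.81])        # gravity for a terrestrial test (m/s^2)
r_target = np.zeros(3)                 # landing pad position (m)
v_target = np.array([0.0, 0.0, -1.0])  # gentle touchdown velocity (m/s)

def zem_zev_accel(r, v, t_go):
    """Thrust acceleration that nulls the predicted position and velocity misses."""
    zem = r_target - (r + v * t_go + 0.5 * g * t_go**2)  # zero-effort miss
    zev = v_target - (v + g * t_go)                      # zero-effort velocity error
    return 6.0 * zem / t_go**2 - 2.0 * zev / t_go

# Crude closed-loop simulation: start offset and descending, re-plan every cycle
r = np.array([60.0, -40.0, 250.0])   # position (m)
v = np.array([5.0, 0.0, -20.0])      # velocity (m/s)
t_go, dt = 25.0, 0.1
while t_go > dt:
    a = zem_zev_accel(r, v, t_go)
    v += (g + a) * dt
    r += v * dt
    t_go -= dt

print("final position error (m):", round(float(np.linalg.norm(r - r_target)), 2))
```

The point of the sketch is that nothing is pre-scripted; the commanded acceleration is recomputed from the vehicle's current state on every cycle, which is the basic flavor of the autonomy being demonstrated.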

[ Draper GENIE ]


Monday was April Fools' Day, in case you hadn't noticed, and here's what Sphero came up with:

What's funny is that it's not really a joke at all:

Sphero, meet the Rotundus GroundBot. Rotundus GroundBot, meet Sphero.

[ Sphero ]

[ Rotundus ]


Need a robot head? How about helping to Kickstart this one: it's called MAKI, and it's super cheap, because you can 3D print the entire thing and then rig it up with your own hardware:

Just $30 gets you the 3D printer files along with full assembly instructions.

[ Hello Robo ] and [ Kickstarter ] via [ SUYS ]


Festo's spectacular robot dragonfly from last week is back, in this video that does a great job of showing how all of the different actuators work:

Festo has also been working on this "learning gripper," which can teach itself to grip stuff:

The LearningGripper from Festo looks like an abstract form of the human hand. The four fingers of the gripper are driven by twelve pneumatic bellows actuators with low-level pressurisation. Thanks to the machine learning process, the gripper is able to teach itself to carry out complex actions such as the targeted gripping and positioning of an object.
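
Festo doesn't spell out the learning method in the clip, but "teach itself" boils down to trial and error: try a pressure pattern, check whether the grasp worked, and favor the patterns that do. Here's a toy sketch of that idea, with a made-up action set and a pretend grasp simulator standing in for the real pneumatics:

```python
import itertools
import random

# Toy trial-and-error learner in the spirit of Festo's LearningGripper.
# The real gripper's algorithm isn't described in the video; the action set
# and the simulated_grasp() model below are invented stand-ins.

N_CHANNELS = 4  # one "curl" pressure channel per finger in this toy model
ACTIONS = list(itertools.product([0.0, 0.5, 1.0], repeat=N_CHANNELS))  # pressure levels

def simulated_grasp(pressures):
    """Pretend physics: grasps succeed most often when opposing fingers squeeze evenly."""
    balance = 1.0 - abs(sum(pressures[:2]) - sum(pressures[2:]))
    squeeze = sum(pressures) / N_CHANNELS
    return random.random() < max(0.0, balance) * squeeze

value = {a: 0.0 for a in ACTIONS}   # estimated success rate per pressure pattern
count = {a: 0 for a in ACTIONS}
epsilon = 0.2                       # fraction of trials spent exploring

for trial in range(5000):
    if random.random() < epsilon:
        action = random.choice(ACTIONS)                # explore a new pattern
    else:
        action = max(ACTIONS, key=lambda a: value[a])  # exploit the best so far
    reward = 1.0 if simulated_grasp(action) else 0.0
    count[action] += 1
    value[action] += (reward - value[action]) / count[action]  # running average

best = max(ACTIONS, key=lambda a: value[a])
print("learned pressure pattern:", best, "success estimate:", round(value[best], 2))
```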

And there's one more thing, too. It's called WaveHandling, and it's a sort of three-dimensional pneumatic conveyor belt that uses wave motion to move objects. Huh? Just watch:

The pneumatic conveyor belt can transport objects in a targeted manner and sort them at the same time. It consists of numerous bellows modules that deform the surface, creating a wave motion that moves the objects along.
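
If the "wave motion" part is hard to picture, the idea is that each bellows module follows a height command from a wave that travels across the grid, so anything sitting on the surface gets nudged along in the wave's direction. Here's a rough sketch of that actuation pattern; the grid size, wavelength, and speed are arbitrary choices, not Festo's specs:

```python
import math

# Traveling-wave height commands for a grid of bellows modules, in the spirit
# of Festo's WaveHandling surface. Dimensions and rates are illustrative only.

ROWS, COLS = 9, 15      # bellows modules in the array
WAVELENGTH = 4.0        # in units of module spacing
SPEED = 2.0             # modules per second
AMPLITUDE = 1.0         # normalized stroke (0 = fully deflated, 1 = fully inflated)

def bellows_heights(t, direction=(1.0, 0.0)):
    """Height command for every module at time t, wave traveling along `direction`."""
    dx, dy = direction
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    heights = [[0.0] * COLS for _ in range(ROWS)]
    for r in range(ROWS):
        for c in range(COLS):
            # distance of this module along the travel direction, minus how far
            # the wave has moved, sets the phase of its sine command
            phase = 2.0 * math.pi * ((c * dx + r * dy) - SPEED * t) / WAVELENGTH
            heights[r][c] = AMPLITUDE * 0.5 * (1.0 + math.sin(phase))
    return heights

# Stream frames like this at a steady rate and objects ride the crests toward +x;
# steering just means changing `direction` (and sorting presumably means using
# different directions for different regions of the grid).
frame = bellows_heights(t=0.35)
```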

[ Festo BionicOpter ]

[ Festo LearningGripper ]

[ Festo WaveHandling ]


This robot, from the Illinois Institute of Technology in collaboration with MIT's Robotic Mobility Group, uses something called "active split offset castors" to move omnidirectionally. It's a little bit weird looking, but undeniably effective:

Here it is on grass:
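
If you're wondering how a couple of casters add up to omnidirectional motion: each split offset caster has two independently driven wheels mounted behind an offset vertical pivot, so choosing the two wheel speeds lets you move the pivot point in any direction you like. Here's a rough kinematic sketch, with illustrative dimensions rather than the IIT robot's actual geometry:

```python
import math

# Back-of-the-envelope kinematics for one split offset caster: two driven
# wheels plus an offset pivot give the mount point a fully controllable 2D
# velocity. Dimensions below are made up for illustration.

WHEEL_SEP = 0.10   # distance between the caster's two wheels (m)
OFFSET = 0.05      # distance from the wheel axle to the vertical pivot (m)

def pivot_velocity(v_left, v_right, caster_angle):
    """World-frame velocity of the mount point for the given wheel speeds."""
    v_forward = 0.5 * (v_left + v_right)       # axle midpoint speed
    spin = (v_right - v_left) / WHEEL_SEP      # caster yaw rate
    v_lateral = spin * OFFSET                  # the offset turns spin into sideways motion
    c, s = math.cos(caster_angle), math.sin(caster_angle)
    return (c * v_forward - s * v_lateral,
            s * v_forward + c * v_lateral)

def wheel_speeds(vx_des, vy_des, caster_angle):
    """Invert the map: wheel speeds that move the mount point at (vx_des, vy_des)."""
    c, s = math.cos(caster_angle), math.sin(caster_angle)
    v_forward = c * vx_des + s * vy_des        # desired velocity in the caster frame
    v_lateral = -s * vx_des + c * vy_des
    spin = v_lateral / OFFSET
    return (v_forward - 0.5 * spin * WHEEL_SEP,   # left wheel
            v_forward + 0.5 * spin * WHEEL_SEP)   # right wheel

# e.g. slide the mount point straight sideways, no matter where the caster points
vl, vr = wheel_speeds(0.0, 0.3, caster_angle=0.0)
```

Put two of these under a chassis and you can command the vehicle's translation and rotation independently, which is where the holonomic behavior in the video comes from.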

[ IIT Robotics Laboratory ]


MIT's Senseable City Lab teamed up with Kuka and sponsors Coca-Cola and Bacardi to design whatever this thing is:

I'm thinking it's a ridiculously complicated and expensive robotic way of making a rum and Coke. Or maybe a rum and Coke and a whole bunch of other stuff. We'll find out next week, when Makr Shakr goes on display in Milan.

[ Makr Shakr ]


How good is an iRobot Looj at getting ping pong balls out of your gutters? Let's find out:

Here are some suggestions for what they should try next:

  • Doritos
  • Pom-poms
  • Jell-O
  • Glitter
  • Corks
  • Thumbtacks
  • Golf balls
  • D batteries
  • iPhones
  • Skulls
  • Roombas
  • Grumpy cats

[ iRobot Looj ]


Kids, this is why building robots is cool:

Good luck, and have fun!

[ JPL ]
