Crusher robotic vehicle drives itself, obliterates cars but not our reporter

Update: Some folks have had problems watching the QuickTime videos -- man, I did test them on Firefox running on both Mac OS X and Windows XP and they worked for me, but apologies anyway. And here's the solution: you can now watch the full video of the Crusher field trials without QuickTime.

Last week, our correspondent Sally Adee went to Fort Bliss, Texas, to attend the field trials of DARPA's 6.5-ton unmanned, autonomous off-road vehicle, Crusher, developed by Carnegie Mellon's National Robotics Engineering Center.

On the obstacle course, Crusher proved true to its name by obliterating old cars like a monster truck. But it's also smart enough to avoid certain obstacles in its path. Obstacles like our intrepid reporter herself!

Watch the video to see how Crusher approaches Adee, inspects her with its creepy batting LIDAR "eyes," recognizes her as something it shouldn't destroy, and then backs off.

How does the monster vehicle do that?

Sally Adee explains:

The batting "eyes" you see are LIDAR (which was explained to me as being "invisible laser beams"). Crusher uses it in combination with radar and optical cameras to sketch out the topography around it and in its path, compares what it sees to its database of known objects, and then draws its own conclusions about where to go and where not to go.

For example, when it comes across a big boulder, it analyzes the material, its reflectivity, and its shape. Then, having been loaded with pre-fab specifications of what constitutes a boulder, it decides whether to back off, go around, or roll over the obstacle.
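
Adee's description boils down to a sense-classify-decide loop. Here is a minimal Python sketch of that idea -- to be clear, this is not NREC's code, and every attribute, threshold, and label below is invented purely for illustration:

# Toy sketch of the sense -> classify -> decide loop described above.
# NOT Crusher's real software: all attributes, thresholds, and labels are made up.
from dataclasses import dataclass

@dataclass
class Obstacle:
    height_m: float      # rough size, e.g. estimated from the LIDAR point cloud
    reflectivity: float  # 0..1, e.g. laser return intensity (another cue a real system would weigh)
    rigid: bool          # guessed from radar/camera cues

def classify(obs: Obstacle) -> str:
    """Compare sensed attributes against preloaded object 'specs'."""
    if not obs.rigid and 0.3 < obs.height_m < 2.5:
        return "person_or_animal"   # soft and person-sized: don't touch
    if obs.rigid and obs.height_m > 0.5:
        return "boulder"
    return "low_clutter"            # brush, small debris, and the like

def decide(obs: Obstacle) -> str:
    """Pick a maneuver based on the classification."""
    label = classify(obs)
    if label == "person_or_animal":
        return "back_off"
    if label == "boulder":
        # Big boulders get avoided; smaller ones get driven over.
        return "go_around" if obs.height_m > 1.0 else "roll_over"
    return "roll_over"

print(decide(Obstacle(height_m=1.7, reflectivity=0.8, rigid=False)))  # -> back_off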

This all happens on the fly -- all the processing takes place on board, in real time, inside Crusher's brain. So what exactly is inside that brain?

To handle the incoming "sensory" data stream, Crusher carries eight blade servers -- 32 cores of computing in all -- just to process the images from the laser scanner and the stereo cameras.
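
To give a feel for why image processing alone eats that many cores, here is another minimal sketch -- parallel workers chewing through sensor frames and handing their results to a single planning step. Again, this is purely illustrative: the function names, data, and worker counts are invented and have nothing to do with Crusher's actual software stack.

# Illustrative only -- not Crusher's real code. Fan incoming sensor frames out
# to a pool of worker processes (the way a multi-core perception pipeline
# spreads image processing across CPUs), then hand the results to a planner.
from multiprocessing import Pool

def process_frame(frame):
    """Stand-in for heavy per-frame work (stereo matching, LIDAR gridding)."""
    sensor, pixels = frame
    return sensor, sum(pixels) / len(pixels)  # fake "feature": mean intensity

def plan(features):
    """Stand-in for the planner that turns the fused features into a maneuver."""
    return "steer_left" if any(value > 0.5 for _, value in features) else "go_straight"

if __name__ == "__main__":
    # Fake frames from two cameras and the laser scanner.
    frames = [("left_cam", [0.2, 0.4]), ("right_cam", [0.7, 0.9]), ("lidar", [0.1, 0.3])]
    with Pool(processes=4) as pool:  # four workers standing in for Crusher's 32 cores
        features = pool.map(process_frame, frames)
    print(plan(features))  # -> steer_left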

Right behind those eyes, Crusher's autonomy system runs on 38 processors -- commercial off-the-shelf chips like Xilinx FPGAs and Intel and AMD CPUs. This is what figures out where to go and how to get there.

Just keep in mind that this monstrosity was constructed almost entirely at Carnegie Mellon by undergrads, grad students, professors, staff engineers and industry people (they did pretty much everything but load up the chassis). Now that's an engineering curriculum.

PS: Adee is preparing a complete video report on the trials -- it will be on Spectrum's site this Wednesday, so check back. But if you want more Crusher now, continue reading this post for a video of the vehicle smashing two cars like sardine cans.
