Video Friday: Robots With Airbags, Drone vs. Drone, and MIT's Jumping Cube

Your weekly selection of awesome robot videos

6 min read

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Robot with airbag collision test with human
Image: DLR via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

HRI 2017 – March 6-9, 2017 – Vienna, Austria
IEEE ARSO – March 8-10, 2017 – Austin, Texas, USA
IEEE SSRR – March 10-13, 2017 – Shanghai, China
NYC Drone Film Festival – March 17-19, 2017 – New York, N.Y., USA
European Robotics Forum – March 22-24, 2017 – Edinburgh, Scotland
NDIA Ground Robotics Conference – March 22-23, 2017 – Springfield, Va., USA
Automate – April 3-6, 2017 – Chicago, Ill., USA
ITU Robot Olympics – April 7-9, 2017 – Istanbul, Turkey
ROS Industrial Consortium – April 07, 2017 – Chicago, Ill., USA
U.S. National Robotics Week – April 8-16, 2017 – USA
NASA Swarmathon – April 18-20, 2017 – NASA KSC, Florida, USA
RoboBusiness Europe – April 20-21, 2017 – Delft, Netherlands
RoboGames 2017 – April 21-23, 2017 – Pleasanton, Calif., USA
ICARSC – April 26-30, 2017 – Coimbra, Portugal

Enjoy today’s videos, and let us know if you have suggestions for next week.

This is one of the best things I have ever seen. Watch until the end:

In this video we present a new safety module that makes different robot tools safe for collaborative tasks. The module, inflated with air while the robot is in motion, covers mounted tools and carried workpieces. When the robot is stationary or moving very slowly, the safety module retracts and the tool is uncovered. In our experiments we found that we can increase the velocity up to 1 m/s with a common sharp-edged industrial gripper while satisfying the requirements of ISO/TS 15066 and retaining the full functionality of the tool.
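For context, the power-and-force-limiting part of ISO/TS 15066 roughly ties the permissible relative speed at contact to the force limit and stiffness of the body region being struck. Here’s a back-of-the-envelope sketch of that relation; every number below is an illustrative placeholder, and this is not DLR’s calculation:

```python
import math

# Rough sketch of the ISO/TS 15066 transient-contact relation between permissible
# force, body-region stiffness, and relative speed. Values are illustrative guesses,
# not the parameters used in the DLR experiments.
F_max = 280.0     # assumed permissible transient contact force for a body region, N
K = 75_000.0      # assumed effective spring constant of that body region, N/m
m_H = 4.6         # assumed effective mass of the human body region, kg
m_R = 30.0        # assumed effective mass of the moving robot parts plus payload, kg

mu = 1.0 / (1.0 / m_H + 1.0 / m_R)      # reduced mass of the two-body contact model
v_rel_max = F_max / math.sqrt(mu * K)   # permissible relative speed at contact, m/s
print(f"permissible relative speed ≈ {v_rel_max:.2f} m/s")
```

Covering or cushioning a sharp tool effectively lowers the contact pressure and stiffness, which is what pushes the permissible speed up.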

We hereby commend the brave researchers at the DLR Robotics and Mechatronics Center in Germany who frequently put their bodies on the line for robotic science. Here you go guys: 🏆

[ DLR RMC ]

Fun times with drones: Watch a human-flown drone with a red hat try, and fail, to run into an autonomous drone that’s programmed to avoid collisions:

[ ASL ]

We’ve posted about MIT’s soft robotic jumping cube before, but I don’t think we’ve ever posted a video with music that makes us feel quite this old:

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a soft robotic cube that uses a series of spring-loaded metal tongues to jump, bounce, and roll. In this video, we watch the process of building a cube in just one minute.

[ MIT ]

Shakey the Robot was recognized last month with an IEEE Milestone! Here’s a neat video SRI put together to explain why Shakey is so important to the history of robotics and AI.

Meet “Shakey” – the world’s first robot to embody artificial intelligence. Shakey could perceive its surroundings, logically deduce implicit facts from explicit ones, navigate from place to place, make a plan to achieve a goal, monitor the execution of a plan in the real world, recover from errors in plan execution, improve its planning abilities through learning, and communicate in simple English. Shakey was created from 1966-72 by the Artificial Intelligence Center at Stanford Research Institute (now SRI International).

[ Shakey ] via [ The Institute ]

Jonathan Grizou writes:

I am a co-founder at Pollen Robotics, a young French startup. We recently designed a prosthetic arm for a cognitive neuroscience institute doing research on prosthesis control via EMG (muscle electrical activity). The arm can be fully 3D-printed on a conventional FDM printer, is human-sized, and is open source.

You’ll probably want to turn on the auto-translated subtitles for this one:

[ Pollen Robotics ]

Thanks Jonathan!

LittleArm is a 3D-printed, Arduino-based robot arm for makers and hobbyists. It’s “designed to be a desktop robot arm that is fun and even practical but also reliable as an educational tool in higher level robotics education.”

The project is already funded on Kickstarter, which means you missed the best deals, but you can still pledge $89 for one as long as you’re able to 3D print most of it yourself.

[ Kickstarter ] via [ LittleArm ]

Thanks Gabe!

iCub can use a “novel event-based particle filter” to track the position of a ball at over 200 hertz. Also, iCub now has a green head for some reason.
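For the curious, here’s a minimal generic sketch of what an event-driven particle filter for 2D ball tracking can look like. This is not IIT’s implementation; the resolution, noise levels, and likelihood model are all assumptions made for illustration:

```python
import numpy as np

# Generic event-driven particle filter for tracking a ball in the image plane.
# All parameters (resolution, noise, kernel width) are illustrative assumptions.
N = 500
particles = np.random.rand(N, 2) * np.array([304.0, 240.0])  # (x, y) hypotheses in pixels
weights = np.ones(N) / N

def update(events, sigma=5.0):
    """One filter step driven by a batch of events (an (M, 2) array of pixel coords)."""
    global particles, weights
    # Motion model: simple random-walk diffusion of the hypotheses.
    particles += np.random.randn(N, 2) * 2.0
    # Likelihood: particles near recent events get more weight.
    dists = np.linalg.norm(particles[:, None, :] - events[None, :, :], axis=2)
    weights *= np.exp(-dists.min(axis=1) ** 2 / (2 * sigma ** 2)) + 1e-12
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = np.random.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.ones(N) / N
    return np.average(particles, weights=weights, axis=0)  # estimated ball center
```

Because each update only touches the pixels that actually fired, this kind of filter can run at hundreds of hertz instead of waiting on full camera frames.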

[ IIT ]

An old video of RHex showing off its springy legs:

The springiness of RHex’s legs is due to their mechanical design. In every step, the legs store energy during compression in the first part of stance and then release some of that energy during the second part of stance. While the specific design of the legs has changed through the years, every iteration of RHex has included mechanically springy legs.
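As a toy illustration of that energy bookkeeping (with made-up numbers, not RHex’s actual leg parameters): treating the leg as a linear spring, the energy stored at mid-stance is simply ½kx².

```python
# Toy spring-leg energy calculation; stiffness and deflection are made-up numbers.
k = 2000.0    # assumed leg stiffness, N/m
x = 0.03      # assumed compression at mid-stance, m
E_stored = 0.5 * k * x**2
print(f"energy stored at mid-stance ≈ {E_stored:.2f} J")  # 0.90 J with these numbers
```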

[ Kodlab ]

I guess Ford was feeling left out because everyone else has been making wild and crazy urban delivery-drone promises and they haven’t yet. Better do something about that, right?

[ Ford ] via [ Engadget ]

It’s always impressive to see how fast cobots are evolving and learning to do stuff out in the real world. Here’s a Sawyer montage:

[ Rethink Robotics ]

Night operations are more challenging than daytime operations, and are oftentimes more critical. Using a Draganflyer drone equipped with a SureFire searchlight to light up the ground below, search and rescue (SAR) operations become much more efficient. Unlike infrared sensors, the bright light can rapidly communicate information to public safety personnel on the ground, identifying a missing person or a subject of interest.

[ Draganfly Innovations ]

TurtleBot 3 goes to Duckietown:

[ Robotis ] via [ MIT Duckietown ]

Helping to fight fires on ships seems like a worthwhile job for drones.

[ BERISUAS ]

Montgomery Blair High School in Silver Spring, Md., has some really talented STEM-minded students, so I guess it’s not surprising that the U.S. Navy has drafted them to build robots:

This is what the underwater robotic gliders look like out in the wild:

[ MCPS ]

On recent summer afternoons on Mars, navigation cameras aboard NASA’s Curiosity Mars rover observed several whirlwinds carrying Martian dust across Gale Crater.

[ JPL ]

A good idea from Europe: Are you a small or medium-sized business that wants to know what robots can do for you? Go to a Robotics Innovation Facility and play with some robots, meet with smart people, and see what can happen.

[ RIF ]

This is not why I’m concerned about urban delivery drones:

But maybe it should be...

[ Kickstarter ]

With a cruising altitude of 65,000 feet and an endurance of 30 hours, NASA’s Global Hawk is the ideal platform for figuring out what water vapor does in our atmosphere:

[ ATTREX ]

This week’s CMU RI Seminar: Katsushi Ikeuchi, Principal Researcher, Microsoft Research Asia:

Tangible heritage, such as temples and statues, is disappearing day by day due to human and natural disasters. Intangible heritage, such as folk dances, local songs, and dialects, faces the same fate due to a lack of inheritors and the mixing of cultures. We have been developing methods to preserve such tangible and intangible heritage in digital form. This project, which we refer to as e-Heritage, aims not only to record heritage, but also to analyze the recorded data for better understanding, and to display the data in new forms for promotion and education.

This talk consists of three parts. The first part briefly covers tangible heritage, in particular our projects in Cambodia and Kyushu. Here I emphasize not only the challenges in data acquisition but also the importance of creating a new branch of science, cyber-archaeology, which allows us to make new archaeological findings based on the digital data obtained. The second part covers how to display a Japanese folk dance through the performance of a humanoid robot. Here we follow the learning-from-observation paradigm, in which a robot learns how to perform a dance by observing a human dance performance. Due to the physical differences between a human and a robot, the robot cannot exactly mimic the human’s actions. Instead, the robot first extracts the important actions of the dance, referred to as key poses, and then describes them symbolically using Labanotation, which the dance community has long used for recording dances. Finally, this Labanotation is mapped to each robot’s hardware to reconstruct the original dance performance. The third part tries to answer the question of what the merit is of preserving folk dances through robot performance; our answer is that the symbolic representations used for robot performance provide new understandings of those dances. To demonstrate this point, we focus on the folk dances of the indigenous peoples of Taiwan, who comprise 14 different tribes. We have converted those folk dances into Labanotation for robot performance. Further, by analyzing the Labanotation obtained, we can clarify the social relations among these 14 tribes.
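In outline, the learning-from-observation pipeline the talk describes goes: observe the dance, extract key poses, encode them symbolically in Labanotation, then re-target the score to a particular robot. Here’s a hypothetical sketch of that flow; every function name, threshold, and data structure below is made up for illustration and is not from Ikeuchi’s system:

```python
from dataclasses import dataclass

# Hypothetical sketch of a learning-from-observation dance pipeline.
# Names, thresholds, and the symbolic scheme are illustrative assumptions.

@dataclass
class Pose:
    t: float        # time, seconds
    angles: dict    # joint name -> angle, radians

def extract_key_poses(trajectory, speed_eps=0.05):
    """Keep poses where the motion nearly stops -- a crude stand-in for key-pose detection."""
    keys = [trajectory[0]]
    for prev, cur in zip(trajectory, trajectory[1:]):
        speed = max(abs(cur.angles[j] - prev.angles[j]) / (cur.t - prev.t) for j in cur.angles)
        if speed < speed_eps:
            keys.append(cur)
    return keys

def to_symbols(key_poses):
    """Quantize each joint into a coarse symbolic level, loosely Labanotation-style."""
    def level(a):
        return "low" if a < -0.5 else "high" if a > 0.5 else "mid"
    return [{joint: level(angle) for joint, angle in p.angles.items()} for p in key_poses]

def retarget(symbols, robot_joint_limits):
    """Map the symbolic score onto a specific robot's joint ranges."""
    frac = {"low": 0.1, "mid": 0.5, "high": 0.9}
    return [{j: lo + frac[s[j]] * (hi - lo) for j, (lo, hi) in robot_joint_limits.items()}
            for s in symbols]
```

The appeal of the symbolic middle layer is exactly what the abstract argues: the same score can be replayed on different robot bodies, and the symbols themselves become data for comparing dances.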

[ CMU RI ]
