Video Friday: Cyborg Athletes, Drone Drop Test, and Robot Makes Sandwiches

Your weekly selection of awesome robot videos

8 min read

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Photo: ETH Zurich/Alessandro Della Bella

Tons of videos this week, and lots of super long ones down at the end. Since we have a bit of time off next week, consider pacing yourself on the videos so that you have something to distract yourself from family time next Friday. Or, you know, just go ahead and binge, we don’t judge. Anyway, moving on...

Video Friday is your weekly selection of awesome robotics videos, collected by your thankful Automaton bloggers. We’re also going to start posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

World Robotics Conference 2015 – November 23-25, 2015 – Beijing, China
Dronetech – November 26, 2015 – Bristol, U.K.
IREX 2015 – December 2-5, 2015 – Tokyo, Japan
RoboUniverse Shanghai – December 8-10, 2015 – Shanghai, China
RoboUniverse San Diego – December 14-16, 2015 – San Diego, Calif., USA
ASU Rehabilitation Robotics Workshop – February 8-9, 2016 – Tempe, Arizona, USA
HRI 2016 – March 7-10, 2016 – Christchurch, New Zealand
WeRobot 2016 – April 1-2, 2016 – Miami, Fla., USA
National Robotics Week – April 2-10, 2016 – United States
Portuguese Robotics Festival – May 4-8, 2016 – Bragança, Portugal
Advanced Robotics Systems and Competitions – May 6, 2016 – Bragança, Portugal
Innorobo 2016 – May 24-26, 2016 – Paris, France
Automatica 2016 – June 21-25, 2016 – Munich, Germany
RoboCup 2016 – June 30-July 4, 2016 – Leipzig, Germany


Let us know if you have suggestions for next week, and enjoy today’s videos.

Cybathlon, “a championship for pilots with disabilities who are using advanced assistive devices including robotic technologies,” will be held next October in Zurich, Switzerland. There was a rehearsal event this summer, which provided footage for a trailer, and IEEE Spectrum’s Eliza Strickland will publish a detailed article about the games in January.

[ Cybathlon ]

“Hebocon, a robot sumo contest for ‘the technically ungifted,’ was held during Tokyo Design Week 2015 and featured lots of ‘crappy robots’ that made the audience laugh out loud.”

I do not understand this at all and I LOVE IT.

[ Hebocon ] via [ Japan Times ]

Why do I not have a robot throwing food into my mouth right now?

[ Dash ]

The Robot Operating System is eight?! Hi there, my name is Evan, and I’m SUPER OLD. Also, see if you can tell how many of the robots in this montage are running ROS:

[THERE WILL BE A VIDEO HERE SOON. WE HOPE.]

[ OSRF ]

Even robots need some fresh air now and then.

And yes, safety first: always use the robot lane.

[ Yale ]

Are there any tasks that Baxter and Sawyer can’t do? Sure there are, but you won’t see them in this video montage:

Also, Rethink Robotics has released version 3.3 of their Intera software. Here’s what’s new:

[ Rethink Robotics ]

How NASA gets robots off of loading docks:

Do not try this with your robot.

[ Super Ball Bot ]

Delft Dynamics’ DroneCatcher system can snag drones with nets fired out of a cannon, but then what? Here’s a test of a parachute drop system so that your drone can bring its prey back to you:

And if you missed the video of the catch, it’s definitely worth watching again:

[ Delft Dynamics ]

Thinking of specializing in robotics at UPenn?

Okay, how about now?

[ GRASP Lab ]

Robot, make me a sandwich!

You can’t quite tell from the video, but for $2, you get to choose peanut butter plus any combination of four different toppings, including honey, blackberry jam, sweet chili (?), and chocolate sauce. How much to get one for my kitchen?

[ Bistrobot ] via [ Bernalwood ]

Thanks Tim!

The problem with using robots during a new car reveal is that everyone stops caring about the, uh, car.

This was done by Aerial Mob, who usually flies drones with cameras on them for TV and movie shoots. And the car in the back is an Infiniti QX30, but seriously, who cares.

[ Aerial Mob ]

“Due to poor visual acuity, it is difficult for humans to accurately perform micrometer-order manipulation by vision alone. To solve this problem, we propose to improve human manipulation performance through cooperation with a high-speed robotic module.

The robotic module consists of 1,000-fps high-speed vision and high-speed actuators, which realize robust visual supervision and fine correction of coarse human manipulation. As a result, the system enables humans to perform high-precision manipulation while reducing their workload and improving working efficiency. As a demonstration task, micrometer-order peg-and-hole alignment is presented here. The robotic module drives the workpiece bearing the hole (70 μm in diameter) and aligns the hole with the peg (50 μm in diameter), which is held by the human, by adjusting its position with micrometer-order accuracy.

This technology can be useful in many industrial applications. For example, it can be used in precision assembly and fabrication to improve the production quality and efficiency of human workers. It can also be adopted in extreme work environments such as outer space, where low or zero gravity hinders highly accurate human manipulation. This system will be demonstrated at the upcoming iREX 2015 (International Robot Exhibition 2015) in Tokyo, from Dec. 2 to Dec. 5, at Tokyo Big Sight.”

[ Ishikawa Watanabe Laboratory ]

The University of Waterloo has little autonomous cars that use LIDAR and maps to navigate themselves around racetracks. Since the video doesn’t have any audio, click here to watch it with an appropriate soundtrack added:

[ WAVE Laboratory ]

“I imagine that one day when I wake up and am lying in bed, the robot will pour me a glass of beer.” THE FUTURE.

[ KUKA ]

“Our team NimbRo Explorer solved all tasks at the DLR SpaceBot Camp with the mobile manipulation robot Momaro: taking a soil sample, finding and grasping two objects, transporting them to a base object, assembly (placing the soil sample on a scale and inserting a battery pack), and operating a switch. The video also shows the removal of debris.”

And one more time, on RoboCam:

[ NimbRo ]

Here are things from YouTube. I do not know what they are, but they look like fun, especially the last one.

[ Kazumichi Moriyama ]

We haven’t been posting videos from Team BlackSheep because some readers questioned whether the way they were flying their drones was safe or legal. This video is one of the most impressive FPV drone videos we’ve ever seen, and it mostly takes place in areas that look to be within immediate line of sight (LOS) of the pilot.

Can we see the outtakes?

[ Team BlackSheep ]

“The DynamixShield is an electronics board that fits onto an Arduino Due microcontroller to give you the ability to control Dynamixel smart servos and regular servos, while also providing numerous Grove and RobotGeek connectors. Grove and RobotGeek are hardware frameworks for modular sensors and actuators. There are tons of off-the-shelf modules for these two frameworks that can be plugged into the shield with a single cable, including GPS sensors, RFID scanners, and LCD displays that are plug-and-play ready for use with the shield. This makes it very easy to build your robots by combining modules and servos.”

[ Kickstarter ] via [ Trossen ]

“What basic skills do our kids need today, to succeed in the future? Just like reading, writing and arithmetic, problem solving and innovative thinking are essential 21st century life skills our kids need. Research shows that one of the most effective ways for kids to learn problem solving is through coding & robotics. Phiro is a robotics & coding platform that teaches problem solving and computational thinking, basic skills required for the next generation. With Phiro, kids can become artists, engineers, economists, astronomers or anything they want.”

[ PHIRO ] via [ RoboHub ]

Something that Jae-Eun Park and the computer vision team at IBM Research have been working on, featuring a 3D Robotics Iris drone:

[ IBM Research ] via [ DIY Drones ]

A guest lecture from Bradford Neuman, AI Engineer, Anki, for Pieter Abbeel’s “Advanced Robotics” class at Berkeley. This goes into much more technical detail on how Anki works than we’ve seen before, and it’s well worth a watch.

And one more: A guest lecture from Liz Murphy, Senior Robot Engineer, Savioke:

[ CS 287 ]

“Recent technological advances in legged robots are opening up a new era of mobile robotics. In particular, legged robots have great potential to help in disaster situations and elderly care services. Whereas manufacturing robots are designed for maximum stiffness, allowing for accurate and rapid position tracking without contact, mobile robots have a different set of hardware/software design requirements, including dynamic physical interaction with their environments.

Events such as the Fukushima power plant explosion highlight the need for robots that can traverse various terrains and perform dynamic physical tasks in unpredictable environments, where robots need compliance that allows for impact mitigation as well as high force capability. The talk will discuss this new mobile robot design paradigm, focusing on actuator characteristics and impulse planning algorithms.

As a successful embodiment of such a paradigm, the talk will introduce the constituent technologies of the MIT Cheetah. Currently, the MIT Cheetah is capable of running at up to 13 mph with an efficiency rivaling that of animals, and of autonomously jumping over an 18-inch-high obstacle.”

There ARE some amazing Cheetah outtakes and stuff, but I’m not going to tell you where they are...bwahaha! Watch the talk:

[ MIT Biomimetics ]

“In this lecture, titled ‘On the Ethics of Research in Robotics’, Dr. Raja Chatila shares his reflections on this very current and engaging topic, addressing both professionals in the field and the lay public affected by the commercialisation of robotic systems. Today’s robot applications have reached an impressive level of capability and autonomous operation in sectors ranging from transport, defense, and construction to medicine, among many others. The popularisation of robotics raises ethical questions within society at large. Will robots take our jobs? Will AI become completely autonomous and surpass the capabilities of the human mind? What about autonomous lethal weapons acting without human control?”

[ IJARS ]

We posted these videos a couple of weeks ago but Aldebaran told us there was an audio issue (now fixed), so we’re posting them again.

“Murray Shanahan is Professor of Cognitive Robotics in the Dept. of Computing at Imperial College London, where he heads the Neurodynamics Group. His publications span artificial intelligence, robotics, logic, dynamical systems, computational neuroscience, and philosophy of mind. He was scientific advisor to the film Ex Machina, which was partly inspired by his book ‘Embodiment and the Inner Life’ (OUP, 2010).

In this talk he describes what he sees as the main obstacles to achieving human-level artificial intelligence given the current state of machine learning, and suggests a number of ways these obstacles might be overcome. These include speculations on a) Geoff Hinton's notion of thought vectors, b) hybrid symbolic-neural approaches, and c) cognitive architectures inspired by Bernard Baars's global workspace theory.”

[ Murray Shanahan ] via [ Aldebaran Robotics ]

“Dr. Pierre-Yves Oudeyer, Research Director at Inria: A great mystery is how human infants develop: how they discover their bodies, how they learn to interact with objects and social peers, and how they accumulate new skills throughout their lives. Such development is organized and progressive, and results from complex growth processes and interactions between brain mechanisms and the physical and social environment. Constructing robots, and building mechanisms that model such developmental processes, is key to advancing our understanding of human development, in constant dialogue with the human and life sciences.

I will illustrate this point by presenting robotic models of curiosity-driven learning and exploration, and show how developmental trajectories can self-organize, starting from discovery of the body, then object affordances, then vocal babbling and vocal interactions with others. In such a vision, a limited set of meta-cognitive structures allows a learner to order its own learning experiences, creating its own curriculum in which skills, including the manipulation of external objects, are naturally sequenced toward increasing complexity. In particular, I will show that the onset of language emerges spontaneously out of such sensorimotor development.”

[ Inria ] via [ Aldebaran Robotics ]

Now it’s time for the good stuff, hooray! A few weeks ago, the Bay Area Robotics Symposium (formerly the Stanford-Berkeley or Berkeley-Stanford Robotics Symposium) was held in Berkeley, Calif. Videos of the entire thing are now online, and you can watch the whole day of presentations right here:

[ BARS 2015 ]
