
Video Friday: Soft Robot Impedance Control, Autonomous Rescue Drone, and RoboSimian Skating

Your weekly selection of awesome robot videos

DLR's David humanoid
Photo: DLR

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

ICMA 2018 – August 5-8, 2018 – Changchun, China
SSRR 2018 – August 6-8, 2018 – Philadelphia, Pa., USA
ISR 2018 – August 24-27, 2018 – Shenyang, China
BioRob 2018 – August 26-29, 2018 – University of Twente, Netherlands
RO-MAN 2018 – August 27-30, 2018 – Nanjing, China
ELROB 2018 – September 24-28, 2018 – Mons, Belgium
ARSO 2018 – September 27-29, 2018 – Genoa, Italy
ROSCon 2018 – September 29-30, 2018 – Madrid, Spain
IROS 2018 – October 1-5, 2018 – Madrid, Spain

Let us know if you have suggestions for next week, and enjoy today’s videos.

Some interesting and practical work from the Autonomous Systems Lab at ETH Zurich on using drones for search and rescue applications.

[ ASL ]

Japan’s edition of Hebocon (the “best worst robot competition”) took place at the end of last month, I think, and while we hope to get a bunch of really good videos at some point, here’s a tiny teaser fight that’s too good not to share.

On the left, we have “Transparent Robot,” which consists mostly of a GoPro pointing backwards, and a phone that displays the video feed from the GoPro pointing forwards, making the robot “invisible.” Or, “Hebocon invisible,” anyway.

And on the right, we have (possibly badly Google translated) “Morse in the Shell,” which is powered by the wind from a shell that someone blows into:

And this is why Hebocon is awesome.

[ Daily Portal ] via [ Robotstart ]

Soft robots equipped with variable stiffness actuators (VSAs) are robust against impacts and energetically efficient. However, because of the real springs in their joints, soft robots can oscillate even when the motors are not moving, which makes controlling, say, the end effector a challenging task. A new method, called Elastic Structure Preserving Impedance (ESPi) control, enables shaping the interaction behavior of soft robots, and its performance is demonstrated in numerous experiments on the VSA robotic arm David. The video accompanies "Elastic Structure Preserving Impedance (ESPi) Control for Compliantly Actuated Robots," published at IROS 2018 by Manuel Keppler, Dominic Lakatos, Christian Ott, and Alin Albu-Schäffer. Video by Ferdinand Elhardt and Manuel Keppler.

[ DLR RMC ]
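If you're curious what an impedance controller actually computes, here's a minimal single-joint sketch in Python. To be clear, this is a plain-vanilla impedance law plus a linear spring model, not DLR's ESPi controller, and every numerical value in it is a made-up illustrative assumption.

# Minimal sketch of a conventional impedance controller for one
# series-elastic joint (motor -> spring -> link). This is NOT the ESPi
# controller from the video; it only illustrates the kind of interaction
# behavior (a virtual spring and damper) that such controllers shape.
# All numbers are illustrative assumptions.

K_SPRING = 100.0   # physical joint-spring stiffness, Nm/rad (assumed)
K_DES    = 40.0    # desired virtual stiffness toward the target, Nm/rad
D_DES    = 8.0     # desired virtual damping, Nm*s/rad

def spring_torque(theta_motor, q_link):
    # Torque transmitted through the real joint spring (simple linear model).
    return K_SPRING * (theta_motor - q_link)

def impedance_torque(q_link, dq_link, q_des, gravity_torque=0.0):
    # Virtual spring-damper pulling the link toward q_des, plus gravity
    # compensation so the virtual spring only shapes interaction forces.
    return K_DES * (q_des - q_link) - D_DES * dq_link + gravity_torque

On a real arm, gravity_torque would come from the robot's dynamics model, and the point of methods like ESPi, as the name suggests, is to impose this kind of behavior while preserving, rather than canceling, the robot's intrinsic elasticity.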

This DARPA program called Polyplexus is not about robotics, exactly, but it could be helpful to roboticists (along with a lot of other folks):

[ DARPA ]

EPFL researchers, applying what they observed about insect wings, have developed a hybrid origami drone that can be stiff or flexible depending on the circumstances. When airborne, the structure is stiff enough to carry its own weight and withstand the thrust of the propellers. But if the drone runs into something, it becomes flexible in order to absorb the shock and therefore minimize any damage. This research, which is being carried out in Dario Floreano’s Laboratory of Intelligent Systems, has been published in Science Robotics.

[ EPFL ]

XYZ Robotics is a startup from some of the folks who rocked the Amazon Robotics Challenges, and they're now applying that experience to fast and robust robotic picking.

[ XYZ Robotics ]

Back in 2016, Katie Byl’s lab at UCSB started teaching JPL’s RoboSimian DRC robot, Clyde, how to skate:

As with most things, it’s even better at 10x:

Clyde had to go back to JPL for some upgrades, so UCSB has been working in simulation since then. They’re hoping to get the robot back at some point, though, and I’m assuming that the first thing they’ll be working on is getting it to do all of this on two wheels instead of four.

[ UCSB Robotics ]

Hinamitetu’s latest gymnastics robot is equipped with a SUPERCAPACITOR! And cute little decorations.

[ Hinamitetu ]

Well, here’s one way to do reinforcement learning with physical robots:

[ Vikash Kumar ]

From the Biomimetic Millisystems Lab at UC Berkeley in 2002 comes this micromechanical flying insect, made of stainless steel with 2 piezo actuators:

[ UC Berkeley ]

Researchers from IBM Australia have recently demonstrated a prototype for a new, AI-based system that controls artificial limbs with just a user’s thoughts. The prototype translates movement signals from the user’s brain into executable instructions for robotic appendages. This approach combines custom-developed AI code with commercially available, low-cost, off-the-shelf robotic hardware.

Always, always select the Oreos. Always.

[ IBM Research ]
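Here's a very rough Python sketch of the general pipeline described above: extract features from a window of brain signals, classify them into a movement intent, and turn that intent into a command for the robot hardware. None of this is IBM's code; the crude band-power features, the scikit-learn logistic-regression classifier, and the three-command vocabulary are purely illustrative assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression

COMMANDS = {0: "rest", 1: "open_hand", 2: "close_hand"}   # assumed vocabulary

def power_features(window):
    # Crude per-channel log signal power as a stand-in for real neural
    # features; window is a (channels, samples) array.
    return np.log(np.mean(window ** 2, axis=1) + 1e-9)

def train_decoder(X, y):
    # X: feature rows from labeled recordings, y: intent labels (0, 1, 2).
    return LogisticRegression(max_iter=1000).fit(X, y)

def decode_to_command(decoder, window):
    intent = int(decoder.predict(power_features(window)[None, :])[0])
    return COMMANDS[intent]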

Introducing mobile robots into the collaborative assembly process poses unique challenges for ensuring efficient and safe human-robot interaction. Current human-robot work cells require the robot to cease operating completely whenever a human enters a shared region of the given cell, and the robots do not explicitly model or adapt to the behavior of the human. In this work, we present a human-aware robotic system with single-axis mobility that incorporates both predictions of human motion and planning in time to execute efficient and safe motions during automotive final assembly. We evaluate our system in simulation against three alternative methods, including a baseline approach emulating the behavior of standard safety systems in factories today. We also assess the system within a factory test environment. Through both live demonstration and results from simulated experiments, we show that our approach produces statistically significant improvements in quantitative measures of safety and fluency of interaction.

[ MIT ]
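For a rough feel of the difference between today's stop-when-a-human-is-near safety logic and the prediction-based approach the abstract describes, here's a toy Python sketch. It is not MIT's system; the constant-velocity human predictor and the 0.5-meter clearance threshold are assumptions for illustration only.

import numpy as np

CLEARANCE_M = 0.5   # assumed minimum human-robot separation, meters
HORIZON_S   = 2.0   # how far ahead human motion is predicted, seconds
DT          = 0.1   # time step between trajectory samples, seconds

def baseline_speed_scale(human_in_shared_zone):
    # Emulates standard factory safety: full stop whenever a person is present.
    return 0.0 if human_in_shared_zone else 1.0

def predictive_speed_scale(human_pos, human_vel, robot_traj):
    # Slow or stop only if the *predicted* human path conflicts with the
    # robot's planned motion (constant-velocity prediction for simplicity).
    # human_pos, human_vel: 2D numpy arrays; robot_traj: planned XY positions
    # sampled every DT seconds.
    scale = 1.0
    for k, robot_xy in enumerate(robot_traj[: int(HORIZON_S / DT)]):
        predicted_human = human_pos + human_vel * (k * DT)
        dist = float(np.linalg.norm(predicted_human - robot_xy))
        if dist < CLEARANCE_M:
            return 0.0                                  # predicted conflict
        scale = min(scale, dist / (2.0 * CLEARANCE_M))  # ease off when close
    return scale

The baseline gives up throughput whenever a person is merely nearby; the predictive version only slows down when the two paths are actually headed for a conflict, which is the kind of fluency gain the paper is quantifying.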

The humanoid robot ARMAR-6 collaborates with a human worker in a bimanual overhead task and demonstrates force-based bimanual manipulation, vision-based grasping, fluent object handover, human activity recognition, natural-language human-robot dialog and interaction, and navigation, among many other capabilities, in a use case supplied by Ocado. In a complex maintenance task demonstration, the robot was able to recognize from speech, force, and visual information that a technician needed help.

[ ARMAR-6 ]

If you’ve pre-ordered a Segway Loomo, this kit will let you turn it into a go-kart:

[ Segway ]

The 2018 Robotics: Science and Systems (RSS) conference took place at CMU at the end of last month, and there are lots of videos now online, including keynotes, invited talks, and research presentations. We only have room for two here: the first is an early career spotlight talk from Maya Cakmak at UW.

And the second talk is a full keynote from Jun Ho Oh at KAIST.

[ RSS 2018 ]
