Video Friday: AMBIDEX Cable-Driven Robot Arm, and More

Your weekly selection of awesome robot videos


Evan Ackerman is IEEE Spectrum’s robotics editor.

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

AMBIDEX cable-driven robot arm
Image: NAVER Labs

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

HRI 2019 – March 11-14, 2019 – Daegu, South Korea
Nîmes Robotics Festival – May 17-19, 2019 – Nîmes, France

Let us know if you have suggestions for next week, and enjoy today’s videos.

AMBIDEX is a robot arm resulting from collaborative R&D on human-robot coexistence. The arm uses innovative cable-driven mechanisms that make any interaction with humans safe.

[ NAVER Labs ]

Somehow we missed this video from November but here’s Cassie Blue equipped with a torso full of lidars (!) and wearing sneakers (!!). The robot was taken to a construction site to map out the building that will be the new home of the University of Michigan’s robotics program. On the way there Cassie blew a capacitor in its left leg and needed a little help from its human friends. But this kind of experiment is a good example of the potential of legged robots to do useful work in real-world environments.

[ Michigan Robotics ]

Using the fossil and fossilized footprints of a 300-million-year-old animal, an interdisciplinary team that includes scientists from EPFL and Humboldt-Universität zu Berlin has developed a method for identifying the most likely gaits of extinct animals and designed a robot that can recreate their walk. This innovative study of animal biomechanics using robots can help researchers better understand how vertebrate locomotion evolved over time.

EPFL put together a nifty interactive website that lets you compare the gaits of real animals, which you can check out at the link below.

[ Nature ] via [ EPFL ]

Using a computer system wired similarly to animal brains, a four-legged, dog-like robot successfully “learned” to run faster and recover from falls in various positions, a skill not previously observed in other four-legged robots of its kind, a new study finds. The advancement may pave the way to real-world applications such as carrying heavy loads up stairs at construction sites, inspecting unstructured underground tunnels, and exploring other planets.

Jemin Hwangbo and colleagues trained a neural network through multiple reinforcement-learning simulations, then transferred it onto an existing medium-dog-sized robot named ANYmal. The training sessions ran nearly 1,000 times faster than real time and were conducted on a personal computer with a single processor, operating more efficiently and at lower cost than comparable approaches. Importantly, ANYmal broke its previous speed record by 25 percent and followed velocity commands more accurately than the controllers previously used on ANYmal robots, the authors say. The robot was also able to right itself after falls, a feat that requires a high level of coordination and control of momentum.
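The sim-to-real recipe described above (train a policy in a simulator running far faster than real time, then transfer the learned parameters to hardware) can be sketched with a toy example. Everything here is a hypothetical stand-in: a 1-D velocity-tracking "simulator," a two-parameter linear policy, and hill climbing instead of the paper's deep RL.

```python
import numpy as np

# Toy illustration of the sim-first recipe: optimize a policy in a cheap,
# fast simulator, then deploy the learned parameters. The simulator,
# policy, and optimizer below are stand-ins, not the ANYmal setup.

rng = np.random.default_rng(0)

def simulate(theta, steps=50):
    """Roll out a linear policy in a trivial velocity-tracking simulator."""
    v, target, reward = 0.0, 1.0, 0.0
    for _ in range(steps):
        obs = np.array([v, target - v])        # state: velocity and error
        action = float(np.clip(obs @ theta, -1.0, 1.0))
        v += 0.1 * action                      # simple first-order dynamics
        reward -= (target - v) ** 2            # penalize tracking error
    return reward

def train(iters=300, sigma=0.1):
    """Simple hill climbing: keep a random perturbation only if it helps."""
    theta = np.zeros(2)
    best = simulate(theta)
    for _ in range(iters):
        cand = theta + sigma * rng.normal(size=2)
        r = simulate(cand)
        if r > best:
            theta, best = cand, r
    return theta

theta = train()  # "training in simulation"; theta is what gets transferred
```

In the real work the same structure scales up: the toy simulator becomes a physics engine, the linear policy a neural network, and hill climbing a modern RL algorithm.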

[ Science ]

KUKA partner Life Science Robotics has developed a revolutionary rehabilitation therapy system that uses a KUKA LBR Med robot. The robot, named ROBERT®, is designed for the rehabilitation of bedridden patients. The purpose of ROBERT® is to identify needs and to make a difference for patients, healthcare workers, and society. With the LBR Med as its main component, ROBERT® is the first robot in the world custom-made for the rehabilitation of bedridden patients.

This seems like a great application, since the human does the clever things, and the robot can do the boring repetition very, very well.

[ Life Science Robotics ]

The World Is Not Enough (WINE) is a concept for a new generation of spacecraft that takes advantage of In-Situ Resource Utilization (ISRU) to explore space. WINE mines planetary regolith to extract water, capturing the water as ice in a cold trap and heating it to create steam for propulsion. By propulsively "hopping" from location to location, WINE can explore multiple Solar System bodies as well as traverse individual ones (e.g., WINE could cover much greater distances on Europa or the Moon than a rover, and can reach otherwise inaccessible regions). And because it refuels itself as it goes, WINE’s range is not limited by consumables. This makes WINE particularly well suited to prospecting and reconnaissance missions.

This video shows a series of tests performed on a proof-of-concept WINE prototype vehicle at Honeybee Robotics. The vehicle demonstrates several of the primary operations that would be required of the WINE spacecraft including: mining and heating regolith to extract water; capturing water as ice in a cold trap; reorienting the vehicle to allow for further mining; pushing captured water into a propulsion tank; and heating propellant to create steam for thrust. All systems demonstrated are fully functional. All tests are conducted with regolith simulant in a vacuum chamber.
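As a back-of-the-envelope illustration of why refueling matters: steam is a low-performance propellant, but the rocket equation resets every time the tank is refilled from local regolith. The specific impulse and masses below are hypothetical placeholders; the video gives no specs.

```python
import math

# Illustrative only: Isp and masses are invented numbers for a small
# steam-propelled hopper, not actual WINE specifications.
G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s, wet_mass_kg, dry_mass_kg):
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m_wet / m_dry)."""
    return isp_s * G0 * math.log(wet_mass_kg / dry_mass_kg)

# e.g. a 25 kg dry vehicle carrying 10 kg of extracted water, with a
# low-temperature steam thruster around Isp ~ 190 s:
dv = delta_v(190.0, 35.0, 25.0)  # roughly 630 m/s per refueling cycle
```

Because each hop's propellant is replaced in situ, that per-cycle delta-v can be repeated indefinitely, which is what decouples total range from launch mass.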

[ Honeybee Robotics ]

Okay, I’m impressed. Even more than I’m normally impressed with Hinamitetu’s robots. Wow.

[ Hinamitetu ]

Team BlackSheep is doing something practical for once: sending airmail by drone over the Swiss Alps, a distance of 100 km.

[ Team BlackSheep ]

A half-scale version of the ExoMars rover, called the ExoMars Testing Rover (ExoTeR), manoeuvred itself carefully through the red rocks and sand of the 9 x 9 m Planetary Utilisation Testbed, part of ESA’s Planetary Robotics Laboratory at its ESTEC technical centre in the Netherlands. This was a test of the autonomous navigation software destined for ESA’s ExoMars 2020 mission to the red planet.

[ ESA ]

We wrote about these clever little robotic toys a while back, but this new demo is worth watching.

[ Sony Toio ]

The Digital Farmhand comprises a small mobile platform that can be remotely or autonomously controlled and that carries a smartphone, sensors, and computing. The robot also has a three-point-hitch system that allows it to use farming implements for activities such as precision seeding, spraying, and weeding. And through its ability to monitor individual plants, the data it produces has the potential to support better on-farm decision making, helping growers increase yield and productivity, reduce input costs, and maximise nutrition security. In this video, we travelled to Samoa to trial the robot on three different farms and conducted a workshop with local farmers to get feedback on how a system like Digital Farmhand could be used in the region.

[ Digital Farmhand ]

Misty Robotics has always said that they’re going to be relying on developers to come up with useful applications for Misty, but they’re doing some work themselves, too.

[ Misty Robotics ]

Safe autonomous navigation of micro air vehicles in cluttered dynamic environments is challenging due to the uncertainties arising from robot localization, sensing and motion disturbances. This paper presents a probabilistic collision avoidance method for navigation among other robots and moving obstacles, such as humans.
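The "probabilistic" part of the abstract refers to reasoning about collision probability under uncertainty rather than worst-case distance. Below is a minimal Monte Carlo sketch of such a check, assuming Gaussian position estimates; all numbers are invented, and the actual method is more sophisticated (analytic, and applied over full trajectories).

```python
import numpy as np

# Hedged sketch: estimate P(collision) for two agents whose positions are
# Gaussian estimates, then apply a chance constraint. Means, covariances,
# radii, and the threshold are all made-up illustration values.

rng = np.random.default_rng(42)

def collision_probability(mu_a, cov_a, mu_b, cov_b, radius, n=100_000):
    """Estimate P(||p_a - p_b|| < radius) for independent Gaussian positions."""
    # The difference of two independent Gaussians is itself Gaussian:
    mu = np.asarray(mu_a, float) - np.asarray(mu_b, float)
    cov = np.asarray(cov_a, float) + np.asarray(cov_b, float)
    d = rng.multivariate_normal(mu, cov, size=n)
    return float(np.mean(np.linalg.norm(d, axis=1) < radius))

# Two vehicles 1 m apart, each with 0.02 m^2 position covariance,
# treated as discs with a combined radius of 0.5 m:
p = collision_probability([0.0, 0.0], 0.02 * np.eye(2),
                          [1.0, 0.0], 0.02 * np.eye(2),
                          radius=0.5)
safe = p < 0.05  # chance constraint: accept the motion only if P < 5%
```

The planner's job is then to pick, at each timestep, a motion whose collision probability stays below the chosen bound.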

I was hoping that by the end, we’d have seen at least one collision. Oh well.

[ TU Delft ]

We aim to enable a mobile robot to navigate through environments with dense crowds, e.g., shopping malls, canteens, train stations, or airport terminals. In these challenging environments, existing approaches suffer from two common problems: the robot may get frozen and cannot make any progress toward its goal; or it may get lost due to severe occlusions inside a crowd. Here we propose a navigation framework that handles the robot-freezing and navigation-lost problems simultaneously.

[ Paper ] via [ RL_SLAM ]

Here’s some stuff that ROBOTIS has been experimenting with lately:

[ Robotis ]

UBTECH brought plenty of robots to CES, and here’s some footage of the biggest ones.

[ UBTECH ]

In this work, we present the integration of multiple components of a full-size humanoid robot system, including control, planning, and perception methods for manipulation tasks. In particular, we introduce a set of modules based on visual object localization, whole-body control, and real-time compliant stabilization on the robot. The introduced methodologies are demonstrated on a box lifting task, performed by our newly developed humanoid bipedal robot COMAN+.

[ Dimitrios Kanoulas ]

Watch students in CMU’s Introduction to Robotics course do a Lego Mindstorms search and rescue scenario.

[ CMU ]
