Video Friday: Rescue Robot, Gesture Control, and 1986 Self-Driving Van

Your weekly selection of awesome robot videos

Gesture-controlled robot arm with Myo armband
Image: FZI via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

Distributed Autonomous Robotic Systems 2016 – November 7-9, 2016 – London, England
Humanoids 2016 – November 15-17, 2016 – Cancun, Mexico
AI-HRI – November 17-19, 2016 – Arlington, Va., USA
Humans, Machines, and the Future of Work – December 5, 2016 – Houston, Texas, USA
RiTA 2016 – December 11-14, 2016 – Beijing, China
WAFR 2016 – December 18-20, 2016 – San Francisco, Calif., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.

The 2016 U.S. Robotics Roadmap was released this week; it’s a massive document authored by 150 roboticists that’s intended to help frame and guide research and policy decisions with the goal of solving societal problems in the United States. We’ll be taking a closer look at it, but here’s a 30-minute summary from lead editor Henrik Christensen:

[ Robotics Roadmap 2016 ] via [ UCSD ]

ANYmal has been looking impressively capable lately:

The legged robot ANYmal can support disaster relief teams with safer search and rescue operations. With its advanced locomotion capabilities, ANYmal can operate in rough outdoor environments, crawl through pipes, and access buildings over steps and stairs. With the help of laser sensors and thermal cameras, the robot can be used to check the safety of buildings and search for potential victims.

If you want one, ANYmal will be for sale soon, as will its integrated modular actuators.

[ ANYbotics ] via [ ETHZ ]

Thanks Péter!

According to Google Translate, this method of rubble traversal by a quadruped is called the “tummy move.” I like it:

WAREC-1, from Waseda University in Japan, can also climb ladders (!):

[ Waseda ]

Parrot’s Bebop 2 drone now comes with some fancy visual recognition modes that help the drone autonomously follow and film you:

These features can be unlocked from within the app for $20. Personally, I like this model, since it keeps the base cost of the drone down while also encouraging continued development.

[ Parrot ] via [ Engadget ]

Clearpath Robotics is officially not really announcing a brand new lawnmower robot:

[ Clearpath ]

I wouldn’t want to try to drive a boat through Arctic ice, and neither would the captains of ships who have to do it to retrieve anchors from mooring sites. Aerovel’s Flexrotor UAV acts as a spotter aircraft to help find safe routes through the ice:

Video from the aircraft became the most compelling show onboard, especially for the seasoned ice pilot responsible for navigating the ship to its targets. Video was also followed with keen interest by web viewers in Alaska and the “lower 48,” streamed in real-time through the boat’s satellite link. Occasionally their view was clouded by fast-forming fog – a common problem that, together with the cold and distance from help, makes manned-aircraft reconnaissance a daunting proposition over the Arctic. Low visibility would have put a helicopter’s crew in danger, but Flexrotor simply returned to the fog-shrouded ship, landed automatically, and waited for the skies to clear. The whole reconnaissance operation was much safer and more practical than any manned-aircraft option, and less costly.

[ Aerovel ]

I’m not sure how well the Myo gesture-recognition armband is doing for the average consumer, but for roboticists, it’s proving to be an excellent way to intuitively control robot arms:

A 6-DoF Schunk Powerball LWA 4P robotic arm and an SVH anthropomorphic hand are controlled over Bluetooth using the Myo armband. Both the position and orientation of the Myo sensor are used to move the arm and place the hand close to the object of interest. The Myo’s EMG sensors detect the muscle activity of the user’s grasp and release gestures and trigger the corresponding motions of the robotic hand. Collisions with the table are avoided using dynamic kinematic restrictions, so that the hand always stays above the table surface. Thanks to the Myo armband’s very fast calibration, the system can adapt to a new user in under one second.
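
The table-collision trick is worth a closer look. Here’s a minimal Python sketch of how such a scheme could work, assuming hypothetical arm/hand/armband interfaces (this is not FZI’s code): scale the armband’s pose into the arm’s workspace, clamp the commanded target above the table plane, and map EMG gestures to grasp and release.

```python
# Minimal sketch of Myo-style teleoperation with a table-height constraint.
# The Pose type and the arm/hand/armband objects are hypothetical stand-ins.
from dataclasses import dataclass

TABLE_Z = 0.02  # keep the hand at least 2 cm above the table surface (meters)

@dataclass
class Pose:
    x: float        # position (meters)
    y: float
    z: float
    roll: float     # orientation (radians)
    pitch: float
    yaw: float

def armband_to_target(armband_pose: Pose, scale: float = 1.5) -> Pose:
    """Map the armband's pose to an end-effector target, scaled to the workspace."""
    return Pose(
        armband_pose.x * scale,
        armband_pose.y * scale,
        max(armband_pose.z * scale, TABLE_Z),  # the restriction: never command below the table
        armband_pose.roll, armband_pose.pitch, armband_pose.yaw,
    )

def control_step(arm, hand, armband):
    """One teleoperation cycle: move the arm toward the target, map EMG gestures to the hand."""
    arm.move_toward(armband_to_target(armband.pose()))  # arm controller handles IK and limits
    gesture = armband.classify_emg()  # e.g. "fist", "fingers_spread", or None
    if gesture == "fist":
        hand.grasp()
    elif gesture == "fingers_spread":
        hand.release()
```

FZI’s actual system uses dynamic kinematic restrictions rather than this simple z-clamp, but the idea is the same: constrain the commanded target, not the user’s motion.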

[ FZI ]

Thanks Arne!

It’s certainly taking a while, but NASA is steadily working toward a system that will allow drones to fly safely beyond their operators’ line of sight:

The “out of sight” tests, led by NASA in coordination with the Federal Aviation Administration and several partners, were the latest waypoint in solving the challenge of drones flying beyond the visual line of sight of their human operators without endangering other aircraft. They were part of NASA’s Unmanned Aircraft Systems (UAS) traffic management (UTM) research platform.

UTM’s Technical Capability Level Three testing is planned for January 2018 and will involve evaluating tracking procedures for managing cooperative and uncooperative drones to ensure collective safety of manned and unmanned operations over moderately populated areas. Technical Capability Level Four, planned for 2019, will involve higher-density urban areas for autonomous vehicles used for newsgathering and package delivery, and will offer large-scale contingency mitigation.

[ NASA ]

From MIT’s Personal Robotics Group:

In this project, we wanted to understand how the robot’s vocal expressiveness might impact children’s engagement and learning. The robot actively encouraged children to process a story through dialogic reading. We asked whether the robot’s effectiveness was critically dependent on the expressive characteristics of the robot — specifically, the robot’s voice.

[ MIT PRG ]

Anti-drone operations don’t get much more autonomous than this system:

[ Airspace ] via [ Laughing Squid ]

Boing boing boing!

[ ETHZ ]

MARLO seems to be getting out more, although I was hoping for a little extra fall in the video, if you know what I’m saying.

[ University of Michigan ]

This video shows the RIPPA robot working on several commercial vegetable farms around Australia. It demonstrates various experimental autonomous crop-interaction tasks, including autonomous row following, deep learning and 3D image reconstruction, real-time mechanical weeding, real-time variable-rate fluid dispensing using VIIPA, and autonomous soil sampling and mapping.

[ Australian Centre for Field Robotics ]

NASA gave a group of Community College Aerospace Scholars 48 hours and $600 million to construct a rescue rover to retrieve some stranded Mars rovers. And in case it wasn’t obvious from the 48 hours and $600 million, this was a simulation.

College students built small prototype Mars rovers at a competition held Oct. 16 through 19 at NASA’s Armstrong Flight Research Center’s Office of Education at the AERO Institute in Palmdale, California. The competition was a workshop coordinated and implemented through the NASA Community College Aerospace Scholars, or NCAS, program. Forty students from around America participated, scoring points on each activity. In the first challenge, students had to design a contraption that could scoop up rocks and automatically carry them back to home base. The other challenge had the students rescuing stranded rovers from a mock Martian boulder field; again, the rovers had to drive automatically to their targets, retrieve them, and bring them back to base. The competitions also included planning a simulated budget for the missions.

Simulated budgets: the easiest kind of budgets to stick to.

[ NASA ]

Two more videos from CMU illustrating how far autonomous vehicles have come in the last few decades.

Terregator (1983): Terregator has deployed three kinds of sensors: video cameras, a sonar ring, and a scanning laser range finder. The cameras give reflectance information, the sonar measures distance, and the laser scanner senses both. Using these sensing modes, the Terregator has successfully navigated sidewalks and off-road areas, and has mapped a portion of a mine. The Terregator supports evolution into fuller navigation systems with path planning and mapping capabilities. The extensible options provided by an adaptable vehicle are invaluable to the research setting. This project is committed to making significant advancement towards an autonomous vehicle capable of mobile tasks in an outdoor environment.

NavLab 1 (1986): NavLab, or Navigation Laboratory, was the first self-driving car with people riding on board. It was very slow, but for 1986 computing power, it was revolutionary. The NavLab program went on to lay the groundwork for Carnegie Mellon University’s expertise in the field of autonomous vehicles.

[ CMU ]

Should robots feel pain? It’s a complicated problem, because the idea of “pain” is all tied up in emotions, which robots don’t have.

The upshot seems to be that no, robots aren’t going to “feel pain,” but we’ll program them to react to physical stimuli.

[ Cambridge ] via [ Motherboard ]

As with most TED Talks, there’s nothing really new or amazing in this one about the potential for a driverless world, but the biological analogy is a bit interesting:

What if traffic flowed through our streets as smoothly and efficiently as blood flows through our veins? Transportation geek Wanis Kabbaj thinks we can find inspiration in the genius of our biology to design the transit systems of the future. In this forward-thinking talk, preview exciting concepts like modular, detachable buses, flying taxis, and networks of suspended magnetic pods that could help make the dream of a dynamic, driverless world a reality.

[ TED ]
