Video Friday: Boston Dynamics' Spot Goes to Work, and More

Your weekly selection of awesome robot videos


Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Boston Dynamics' Spot Mini
Image: Boston Dynamics via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

Japan Robot Week – October 17-19, 2018 – Tokyo, Japan
The Promise and the Peril of Artificial Intelligence and Robotics – October 23, 2018 – Corvallis, Oregon, USA
Collaborative Robots, Advanced Vision & AI Conference – October 24-25, 2018 – Santa Clara, Calif., USA
ICSR 2018 – November 28-30, 2018 – Qingdao, China

Let us know if you have suggestions for next week, and enjoy today’s videos.

We already posted about the video of Atlas doing parkour, which Marc Raibert first showed at IROS earlier this month; he also showed this video, which is just as interesting (if not quite as dramatic), since it shows SpotMini in what could be its first realistic commercial application.

We have begun field testing the Spot robot for commercial usage around the world. After an initial mapping run, Spot autonomously navigated two dynamic construction sites in Tokyo and used a specialized payload for surveying work progress. An additional camera in its hand lets Spot do even more detailed inspection work on site. The Spot robot will be available in the second half of 2019 for a variety of applications.

[ Boston Dynamics ]

We’re training Aquanaut to be a Remotely Operated Vehicle! Normal ROVs require operators with specialist training and a joystick. Not Aquanaut. We gave one of our staff engineers an iPad and let him run the vehicle through our ROV task panel replica in the test tank.

So, what do we think that blurred out bit is? Maybe a logo? My guess would be S.H.I.E.L.D.

[ Houston Mechatronics ]

What is a robot? There’s a big debate, even among top roboticists, about what defines a robot. Is a dishwasher a robot? How about an automatic door? Or a car’s cruise control? Now add froyo machines to the list:

What do you think? Robot or not robot?

[ Reis & Irvy’s ]

The German drone manufacturer Wingcopter, Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH and DHL just completed a successful medical drone delivery pilot project in Tanzania: More than 180 take-offs and landings, over 2,200 km flown and roughly 2,000 flight minutes recorded.

Incidentally, IEEE Spectrum will be in Mwanza in just a few weeks to check out some delivery drones, Wingcopter included.

[ Wingcopter ]

Thanks, Cameron!

Cassie takes a few experimental hops to see how things go. The awkward hip motion on the final jump is due to minimal ability to actuate in mid-jump by virtue of not having an upper torso (or tail).

Tail!?

[ Agility Robotics ]

We introduce MetaArms, wearable anthropomorphic robotic arms and hands with six degrees of freedom operated by the user’s legs and feet. Our overall research goal is to re-imagine what our bodies can do with the aid of wearable robotics using a body-remapping approach. To this end, we present an initial exploratory case study. MetaArms’ two robotic arms are controlled by the user’s foot motion, and the robotic hands can grip objects according to the user’s toe bending. Haptic feedback is also presented on the user’s feet, correlated with the objects touched by the robotic hands, creating a closed-loop system. We find that MetaArms demonstrates the feasibility of the body-remapping approach in designing robotic limbs that may help us re-imagine what the human body could do.
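
If you’re wondering what that body-remapping loop amounts to in practice, here’s a minimal sketch of the idea: foot pose drives an arm end-effector target, toe bend drives grip closure, and contact force is echoed back as foot vibration. All names, scaling factors, and the simulated sensor stream are illustrative assumptions, not the authors’ actual MetaArms software.

```python
# Hypothetical sketch of a foot-to-arm body-remapping loop (not the MetaArms code).
import random

def read_foot_sensors():
    """Stand-in for the wearable's foot tracker and toe-bend sensor."""
    return {
        "foot_xyz": [random.uniform(-0.1, 0.1) for _ in range(3)],  # meters
        "toe_bend": random.uniform(0.0, 1.0),                       # 0 = relaxed, 1 = fully bent
    }

def foot_to_arm_target(foot_xyz, scale=3.0):
    """Remap small foot displacements to a larger arm workspace."""
    return [scale * v for v in foot_xyz]

def toe_to_grip(toe_bend):
    """Map toe bending directly to gripper closure fraction."""
    return min(1.0, max(0.0, toe_bend))

def haptic_feedback(contact_force, max_force=10.0):
    """Return a vibration intensity in [0, 1] proportional to contact force."""
    return min(1.0, contact_force / max_force)

if __name__ == "__main__":
    for step in range(5):
        sensors = read_foot_sensors()
        arm_target = foot_to_arm_target(sensors["foot_xyz"])
        grip = toe_to_grip(sensors["toe_bend"])
        force = grip * 8.0  # pretend the gripper reports force roughly proportional to closure
        vibration = haptic_feedback(force)
        print(f"step {step}: arm target {arm_target}, grip {grip:.2f}, foot vibration {vibration:.2f}")
```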

[ UIST ]

Project Wing has gone a little quiet lately, but here’s one of their drones—with 12 rotors (a dodecarotor?) for vertical takeoff, plus two props on its wings, allowing it to fly like a fixed-wing plane:

Yeah, still not seeing much in the way of sense-and-avoid, or any way of dealing with “dog grabs thing dangling from air and plays tug of war.”

[ Project Wing ] via [ RobotStart ]

We address a class of manipulation problems where the robot perceives the scene with a depth sensor and can move its end effector in a space with six degrees of freedom – 3D position and orientation. Our approach is to formulate the problem as a Markov decision process (MDP) with abstract yet generally applicable state and action representations. Finding a good solution to the MDP requires adding constraints on the allowed actions. We develop a specific set of constraints called hierarchical SE(3) sampling (HSE3S) which causes the robot to learn a sequence of gazes to focus attention on the task-relevant parts of the scene. We demonstrate the effectiveness of our approach on three challenging pick-place tasks (with novel objects in clutter and nontrivial places) both in simulation and on a real robot, even though all training is done in simulation.
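
The core trick is that sequence of gazes: each one crops the depth-sensed point cloud around a fixation point, narrowing attention coarse-to-fine before a 6-DoF (position plus orientation) grasp is chosen. Here’s a toy illustration of that idea; the random point cloud, crop radii, and grasp selection are made-up stand-ins, not the paper’s HSE3S implementation.

```python
# Toy coarse-to-fine "gaze" cropping before a 6-DoF grasp choice (not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def crop(cloud, center, radius):
    """Keep only points within `radius` of the fixation point `center`."""
    return cloud[np.linalg.norm(cloud - center, axis=1) < radius]

def choose_gaze(cloud):
    """Pick a fixation point; here, simply the centroid of the current view."""
    return cloud.mean(axis=0)

def choose_grasp(cloud):
    """Return a toy 6-DoF grasp: a position plus a yaw angle."""
    position = cloud.mean(axis=0)
    yaw = float(rng.uniform(-np.pi, np.pi))  # a real system scores orientations instead
    return position, yaw

if __name__ == "__main__":
    scene = rng.uniform(0.0, 1.0, size=(5000, 3))  # fake depth-sensor point cloud
    view = scene
    for radius in [0.4, 0.2, 0.1]:                 # coarse-to-fine attention
        fixation = choose_gaze(view)
        view = crop(view, fixation, radius)
        print(f"gaze at {fixation.round(2)}, {len(view)} points remain")
    grasp_pos, grasp_yaw = choose_grasp(view)
    print(f"grasp at {grasp_pos.round(2)}, yaw {grasp_yaw:.2f} rad")
```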

[ Paper ]

Thanks, Marcus!

Raytheon seems to be overly excited about its new high-energy laser weapons system.

Looks like that little Slaughterbots problem has been solved! Although the scariest part was right at the end, where the autonomous killer drones inexplicably decide to flee. They’re self-aware!

[ Raytheon ]

Tasks in outdoor open world environments are now ripe for automation with mobile manipulators. The dynamic, unstructured and unknown environments associated with such tasks -- a prime example would be collecting roadside trash -- make them particularly challenging. In this paper we present an approach to solving the problem of picking up, transporting, and dropping off novel objects outdoors. Our solution integrates a navigation system, a grasp detection and planning system, and a custom task planner. We perform experiments that demonstrate that the system can be used to transport a wide class of novel objects (trash bags, general garbage, gardening tools and fruits) in unstructured settings outdoors with a relatively high end-to-end success rate of 85%.
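
The pipeline the abstract describes is essentially a task planner sequencing navigation, grasp detection and execution, and drop-off. Here’s a rough sketch of that structure; the subsystem stubs, retry policy, and the simulated success rate are assumptions for illustration, not the authors’ code.

```python
# Rough sketch of a pick-transport-drop task planner (illustrative only).
import random

def navigate_to(goal):
    print(f"navigating to {goal}")
    return True  # stub: real system runs an outdoor navigation stack

def detect_and_grasp(target):
    print(f"attempting grasp of {target}")
    return random.random() < 0.85  # stub: mimic the reported ~85% success rate

def drop_off(target):
    print(f"dropping {target} at collection point")
    return True

def collect(target, pickup_goal, dropoff_goal, max_grasp_attempts=2):
    """Plan and execute one pick-transport-drop cycle for a novel object."""
    if not navigate_to(pickup_goal):
        return False
    for _ in range(max_grasp_attempts):
        if detect_and_grasp(target):
            break
    else:
        return False  # all grasp attempts failed
    return navigate_to(dropoff_goal) and drop_off(target)

if __name__ == "__main__":
    ok = collect("trash bag", pickup_goal=(12.0, 3.5), dropoff_goal=(0.0, 0.0))
    print("success" if ok else "failure")
```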

[ arXiv ]

Thanks, Brayan!

The Mars 2020 rover is starting assembly, but before that could happen, it needed to be painted. And since this is NASA, you know that even something like paint is both complicated and interesting.

[ Mars 2020 ]

GECKO is an agile and dexterous robot designed to conduct remote Level 1 ship tank and damage control inspections, which reduces labor costs and human exposure to dangerous environments. A demonstrator of the GECKO system has been developed, and is currently undergoing evaluation, with the goal of transitioning to the Navy fleet, including surface ships, submarines, and carriers.

[ IAI ]

This drone display from Ehang to celebrate 40 years of Chinese economic reform is one of the better ones I’ve seen.

[ Ehang ]

The Roboat project turned 2, happy...activation day? They’ve made a lot of progress in two years, and the project still has three to go.

[ Roboat ]

Pepper is ruthless as a quizmaster.

[ TheAmazel ]

At IMTS 2018, FANUC America demonstrates the super heavy payload capability of the FANUC M-2000iA robot, which easily and safely lifts this complete Chevrolet Bolt vehicle. The system also features autonomous vehicle charging via the flexible FANUC M-10iD/12 robot mounted on a self-driving vehicle.

[ Fanuc ]

Kurt Leucht, a software team lead specializing in in-situ resource utilization and robotics at NASA’s Kennedy Space Center, gave a sort of virtual talk on “using robots to find resources.” It’s targeted at high school students, which means that the rest of us can understand most of it too.

[ NASA ]

Marimba-playing robot Shimon (and cyborg drummer Jason Barnes) in concert at the Tivoli performing center, Utrecht, in the Netherlands.

[ GA Tech ] via [ TechCrunch ]
