Video Friday: Skydio's Car Follow, Telexistence, and Tick-Killing Robot

Your weekly selection of awesome robot videos

Photo: Telexistence
Tokyo-based robotics startup Telexistence unveils the Model H, its first mass-production prototype, along with its control system.

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RoboCup 2018 – June 18-22, 2018 – Montreal, Canada
RSS 2018 – June 26-30, 2018 – Pittsburgh, Pa., USA
Ubiquitous Robots 2018 – June 27-30, 2018 – Honolulu, Hawaii
MARSS 2018 – July 4-8, 2018 – Nagoya, Japan
AIM 2018 – July 9-12, 2018 – Auckland, New Zealand
ICARM 2018 – July 18-20, 2018 – Singapore
ICMA 2018 – August 5-8, 2018 – Changchun, China
SSRR 2018 – August 6-8, 2018 – Philadelphia, Pa., USA
ISR 2018 – August 24-27, 2018 – Shenyang, China
BioRob 2018 – August 26-29, 2018 – University of Twente, Netherlands
RO-MAN 2018 – August 27-30, 2018 – Nanjing, China

Let us know if you have suggestions for next week, and enjoy today’s videos.


Skydio’s R1 can now follow cars, while not crashing into other stuff, of course:

[ Skydio ]


Telexistence thinks, I’m guessing, that putting a humanoid robot on the far end of a telepresence link will make it more natural to interact with. And they’ve done a reasonable job with the design:

I have to wonder what kind of latency you get between Japan and Hawaii, though. And honestly, if I had to pick a place to experience via telepresence, “inside of a surf shop” would not be at the top of my list. But space, pretty cool.

[ Telexistence ]


Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have designed a fleet of autonomous boats that offer high maneuverability and precise control. These boats can be 3-D printed using low-cost hardware and materials, and could serve as models for future self-assembling, driver-less water taxis transporting people and goods from place to place.

[ MIT ]


Exoskeletons are currently being tested for development and adoption in various roles: helping assembly-line workers with repetitive movements, helping baggage handlers lift luggage, and helping people with disabilities become more mobile. At the Applied Collegiate Exoskeleton (ACE) Competition, the first of its kind, teams created exoskeletons for use by rescue workers.

The exoskeletons were put through tests similar to those for entry-level firefighters. After a design review, which assessed safety components and how long it took to suit up, the exoskeletons were fitted with 75 pounds of weights and timed while moving through an obstacle course: across a balance beam, up and down stairs, over uneven terrain, under a low-clearance beam, and dragging a 165-pound mannequin for 100 feet.

Colorado School of Mines won the overall competition, edging out the U-M host team by less than a tenth of a point on a 650-point scale. MSU took third place.

[ University of Michigan ]

Thanks Damen!


Better TickBot than me, mate.

The TickBot uses a magnetic sensor to follow a metal wire strung along the ground. Dry ice loaded into the TickBot releases carbon dioxide as the robot crawls along, and that carbon dioxide lures ticks onto a piece of cloth treated with pyrethrin, a pesticide that is non-toxic to humans.

[ NASA Langley ]


This multi-axis 3D printer from TU Delft moves the object around instead of the plastic squirty bit:

IEEE Spectrum contributor Fan Shi asked Professor Yong-jin Liu, “What is the advantage of this method compared to the normal one, in which the support is fixed and the nozzle is attached to the manipulator?” Professor Liu: “The two methods are symmetrical in geometry. But physically, it is the best choice that the nozzle is always vertical to the ground.”

[ RoboFDM ]


From Japan, for those of you who prefer your horse to be perfectly stationary.

[ Japan Racing Association ]


In a new paper spearheaded by MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and the University of Toronto, researchers demonstrate “VirtualHome,” a system that can simulate detailed household tasks and then have artificial “agents” execute them, opening up the possibility of one day teaching robots to do such tasks.

The team trained the system using nearly 3,000 programs of various activities, which are further broken down into subtasks for the computer to understand. A simple task like “making coffee,” for example, would also include the step “grabbing a cup.” The researchers demonstrated VirtualHome in a 3-D world inspired by The Sims video game.

The team’s artificial agent can execute 1,000 of these interactions in the Sims-style world, with eight different scenes including a living room, kitchen, dining room, bedroom, and home office.

[ MIT ]


If you haven’t noticed, we’re morally obligated to post Cassie videos every week, so here’s one from Agility Robotics itself, showing Cassie learning stairs:

Not to be outdone, the University of Michigan posted this response video, entitled “Trolling Agility Robotics:”

And what’s better than watching videos of Cassie doing stuff? Watching videos of Cassie failing to do stuff:

[ Michigan Robotics ] via [ Agility Robotics ]


Space exploration today is hindered by the size and mass of the hardware and equipment human beings are capable of launching to space, the frequency of those launches and the human safety factors involved in transferring and integrating hardware components during astronaut extravehicular activities, or EVAs.

Now, building on the latest robotic technologies available, NASA and its commercial partners seek to transform the way we manufacture, assemble and repair large structures in space, leading us closer to a robust space infrastructure freed from launch window scheduling, launch vehicle mass limitations and astronaut safety concerns. Ultimately, NASA’s new In-space Robotic Manufacturing and Assembly (IRMA) project will enable more frequent science and discovery missions in Earth orbit, across the solar system and beyond.

[ NASA Langley ]


Details on the computer vision, mobility, and motion systems that you’ll be getting when that Misty II you’ve pre-ordered shows up at your door.

[ Misty Robotics ]


The project TransFIT is part of the space roadmap of the DFKI RIC. It focuses on the assembly and installation of infrastructure for space applications by humans and robots, either autonomously or in cooperation. The cooperation between humans and robots follows the concept of “sliding autonomy”: human control over the robot can be very strong, as in teleoperation; weaker, as in teleoperation with autonomous control of some components; or limited to supervision, as in “operator in the loop” approaches. The goal of the human-robot interaction is not only task sharing but also further training of the robots, enabling more complex autonomous behaviour.

[ DFKI ]


This is some of the most fluid and lifelike robot motion I’ve ever seen, and of course it comes from Disney.

[ Disney ] via [ Gizmodo ]


Single stream recycling, enabled by your friendly robot sorting system.

[ Zen Robotics ]


2020 is going to be an exciting year for robots on Mars, and the ESA’s ExoMars rover will be conducting the most detailed search for Martians yet.

The European Space Agency’s (ESA) ExoMars rover is headed to the red planet in 2020, on a mission to search for signs of past or present life. One of its primary tools in this endeavor is MOMA, the Mars Organic Molecule Analyzer, a sophisticated suite of technologies that squeezes a lab full of chemistry equipment into a package the size of a toaster oven. Its mass spectrometer subsystem and main electronics were built at NASA’s Goddard Space Flight Center in Greenbelt, Md., and mark the first use of a linear ion trap on another planet, a leap forward in the search for life beyond Earth.

[ ExoMars MOMA ]


Wyss Core Faculty member Robert Wood, Ph.D., presents a talk titled “The Mechanical Side of Artificial Intelligence.” Artificial intelligence typically focuses on perception, learning, and control methods to enable autonomous robots to make and act on decisions in real environments. With our research, which focuses on the design, mechanics, materials, and manufacturing of novel robot platforms that make the perception, control, or action easier or more robust, we aim to facilitate this decision-making process in natural, unstructured, and often unpredictable environments. Key principles in this pursuit include bio-inspired designs, smart materials for novel sensors and actuators, and the development of multi-scale, multi-material manufacturing methods.

[ Harvard Wyss Institute ]


On this week’s episode of Robots in Depth, Per interviews Spring Berman from Arizona State University.

Spring Berman talks about her extensive research experience in the field of swarm robotics. One of the underlying ideas of the field is designing robot controllers similar to those used in nature by swarms of animals, systems that work without a leader. We hear how many robots can be used together to handle tasks that would not be possible with one robot or a small number of them. We are also introduced to the possibilities of mixing artificial animals with real ones. Spring describes some of the challenges within swarm robotics, which range from mathematical modelling to regulatory issues, and comments on the next frontier of research and the different research areas that are relevant to advancing the field.

[ Robots in Depth ]

