Video Friday: Russian Android, Swarm User Interface, and Robot Drone Man

Your weekly selection of awesome robot videos


Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Russia's humanoid robot Fedor.
Image: Russia's Foundation for Advanced Studies via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. After two weeks of amazing videos from IROS, we’re back with our regular Video Friday edition.

We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

ICSR 2016 – November 1-3, 2016 – Kansas City, Kan., USA
Social Robots in Therapy and Education – November 2-4, 2016 – Barcelona, Spain
Distributed Autonomous Robotic Systems 2016 – November 7-9, 2016 – London, England
Humanoids 2016 – November 15-17, 2016 – Cancun, Mexico
AI-HRI – November 17-19, 2016 – Arlington, Va., USA
Humans, Machines, and the Future of Work – December 5, 2016 – Houston, Texas, USA
RiTA 2016 – December 11-14, 2016 – Beijing, China
WAFR 2016 – December 18-20, 2016 – San Francisco, Calif., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.

I really don’t know much about this beyond what’s in the video, but we don’t see a lot of Russian robots around here, so:

It looks to be a project from Russia's Foundation for Advanced Studies (roughly the Russian equivalent of DARPA), designed to go into space by 2021.

[ RIA ]

Swarm user interfaces [are] a new class of human-computer interfaces comprised of many autonomous robots that handle both display and interaction. We describe the design of Zooids, an open-source open-hardware platform for developing tabletop swarm interfaces. The platform consists of a collection of custom-designed wheeled micro robots each 2.6 cm in diameter, a radio base-station, a high-speed DLP structured light projector for optical tracking, and a software framework for application development and control. We illustrate the potential of tabletop swarm user interfaces through a set of application scenarios developed with Zooids, and discuss general design considerations unique to swarm user interfaces.

Adorable. ADORABLE.
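If you're curious what programming a swarm display might feel like, here's a very rough sketch of the kind of loop an application could run: the projector-based tracker reports each robot's position, the app maps whatever it wants to display onto target positions, and goals get streamed back out over the radio link. The names and numbers below are our own placeholders for illustration, not the actual Zooids framework or API.

# Hypothetical tabletop-swarm control tick (illustration only, not the Zooids API).
import math

def assign_targets(robot_positions, display_points):
    """Greedily pair each tracked robot with the nearest unclaimed display point."""
    targets, remaining = {}, list(display_points)
    for robot_id, (x, y) in robot_positions.items():
        if not remaining:
            break
        nearest = min(remaining, key=lambda p: math.hypot(p[0] - x, p[1] - y))
        remaining.remove(nearest)
        targets[robot_id] = nearest
    return targets

# One tick: positions come from optical tracking, goals go to the radio base station.
robot_positions = {0: (0.12, 0.05), 1: (0.22, 0.18)}   # meters, on the tabletop
display_points = [(0.30, 0.10), (0.05, 0.25)]          # where the interface wants robots
print(assign_targets(robot_positions, display_points))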

[ Paper ]

This video of Agile Justin feeling up different kinds of rods was a finalist for both “IROS Best Paper on Cognitive Robotics” and “IROS Best Student Paper”:

In this video we show that material classification purely based on the spatio-temporal signal of a flexible tactile skin mounted on the finger tip of the advanced humanoid robot Agile Justin can be robustly performed in a real world setting. We develop a convolutional deep learning network architecture which is directly fed with the raw 24000 dimensional sensor signal of the tactile skin. The network with its 16 million weights is trained from only 540 samples and reaches a classification accuracy of up to 97.3%.

The paper is "Robust Material Classification With a Tactile Skin Using Deep Learning," by S. Baishya and B. Bäuml from DLR.
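To make that concrete, here's a minimal sketch of what a network fed with a raw flattened tactile signal might look like. The reshaping into 60 taxel channels by 400 time steps, the layer sizes, and the six material classes are all our own placeholder choices for illustration, not the DLR architecture from the paper.

# Toy tactile-material classifier (illustrative sketch, not the DLR network).
import torch
import torch.nn as nn

class TactileNet(nn.Module):
    def __init__(self, n_materials=6):          # number of classes is assumed
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(60, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_materials)

    def forward(self, x):                        # x: (batch, 60 taxels, 400 time steps)
        return self.classifier(self.features(x).squeeze(-1))

net = TactileNet()
fake_touch = torch.randn(1, 60, 400)             # stand-in for one 24,000-value skin reading
print(net(fake_touch).shape)                     # -> torch.Size([1, 6])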

[ DLR ]

Thanks Berthold!

This video presents a new, modularized design concept for soft robots based on a bottom-up approach by assembling units. This concept enables the structures and motions of soft robots to be rapidly prototyped and revised to create new designs that can accomplish different tasks.

This is cool because until this point, building pneumatic robots required making molds and casting custom parts. If you can instead just build up whatever you want from modular parts, it’s going to be a lot easier to do research and development.

[ IEEE RAM ]

This could easily be the best thing you see all week:

To understand the previous video, you’ll need to watch the following video:

It all makes sense now!

[ KAIST ]

And now, this:

And this:

[ Takanishi Lab ]

We introduce Rovables, a miniature robot that can move freely on unmodified clothing. The robots are held in place by magnetic wheels, and can climb vertically. The robots are untethered and have an onboard battery, microcontroller, and wireless communications. They also contain a low-power localization system that uses wheel encoders and IMU, allowing Rovables to perform limited autonomous navigation on the body. In the technical evaluations, we found that Rovables can operate continuously for 45 minutes and can carry up to 1.5N. We propose an interaction space for mobile on-body devices spanning sensing, actuation, and interfaces, and develop application scenarios in that space. Our applications include on-body sensing, modular displays, tactile feedback and interactive clothing and jewelry.

I have dreams about robots crawling all over me like this, but usually they’re doing things that are a lot less practical. That’s normal, right?
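If you're wondering how far wheel encoders plus an IMU can get you on a shirt, here's a toy dead-reckoning sketch: the encoders give distance traveled, the IMU gives heading, and the robot integrates both. The wheel radius, encoder resolution, and update math below are our assumptions for illustration, not the actual Rovables firmware.

# Toy encoder + IMU dead reckoning (illustrative sketch, not Rovables code).
import math

WHEEL_RADIUS = 0.006      # meters (assumed)
TICKS_PER_REV = 360       # encoder resolution (assumed)

def dead_reckon(x, y, left_ticks, right_ticks, imu_heading):
    """Advance the on-body position estimate by one encoder/IMU update."""
    dist = 0.5 * (left_ticks + right_ticks) / TICKS_PER_REV * 2 * math.pi * WHEEL_RADIUS
    x += dist * math.cos(imu_heading)     # IMU supplies heading, encoders supply distance
    y += dist * math.sin(imu_heading)
    return x, y

pose = (0.0, 0.0)
for sample in [(10, 10, 0.0), (12, 8, 0.3), (9, 11, 0.6)]:   # fake sensor samples
    pose = dead_reckon(*pose, *sample)
print(pose)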

[ Paper ]

The Warthog is Clearpath’s new UGV. The vehicle itself is an existing platform that Clearpath has roboticized and ROSified, and it’ll tackle pretty much anything you can throw it at:

The video doesn’t show it, but we’re told that this thing is actually amphibious, in that it floats and will do a reasonable job of not sinking.

[ Clearpath ]

Just slightly bigger than a U.S. quarter, Piccolissimo is probably the smallest controllable, self-powered flying robot in existence:

Unlike quadrotors, which estimate the direction of gravity with accelerometers and gyroscopes hundreds of times per second and then actuate their four motors, we designed Piccolissimo to use its dynamics and aerodynamics to keep gravity in a desired direction. This means Piccolissimo does not require any extra motors to orient it. Its single motor is attached to a propeller on one side, and the body of the vehicle on the other. Since every action has an equal and opposite reaction, when the motor spins the propeller it also spins the body in the opposite direction. The body has stabilizers built into it, which act like another set of propellers. If Piccolissimo travels through the air, the stabilizers that spin into the wind, called the advancing blades, see extra wind and generate extra lift. The stabilizers that move away from the wind see less wind, so they generate less lift. This creates a torque, which tries to make Piccolissimo roll. But Piccolissimo is also a gyroscope, and since gyroscopic precession is felt 90 degrees from where torque is applied, the vehicle pitches instead of rolls. This pitching slows down the vehicle, causing it to hover.

To make Piccolissimo steer, we moved the motor slightly off center. This makes a continuous torque that tries to flip the vehicle. Since Piccolissimo spins quickly, about 40 to 50 times per second, this torque never gets the chance to actually turn Piccolissimo over. Instead, the direction of the torque changes and ends up cancelling itself out. If we pulse the motor at the speed of the body's rotation, then we can make this torque not cancel throughout a revolution. By changing the phase of the pulsing we can change the direction of torque generation, allowing Piccolissimo to steer in any direction.
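Here's a toy sketch of that phase-pulsing trick: the single motor gets a brief extra kick once per body revolution, timed so the off-center thrust adds up in one chosen direction instead of averaging out. The spin rate, throttle levels, and pulse width are made-up numbers for illustration, not Modlab's actual controller.

# Toy once-per-revolution pulsing (illustrative sketch, not Modlab's controller).
import math

SPIN_HZ = 45.0            # body rotation rate (assumed)
BASE_THROTTLE = 0.60      # nominal throttle for hover (assumed)
PULSE_GAIN = 0.15         # extra throttle applied during the pulse (assumed)

def throttle(t, steer_phase):
    """Return motor throttle at time t, pulsing when the body points toward steer_phase."""
    body_angle = (2 * math.pi * SPIN_HZ * t) % (2 * math.pi)
    # Pulse only while the body is within ~45 degrees of the commanded direction.
    pulsing = math.cos(body_angle - steer_phase) > math.cos(math.radians(45))
    return BASE_THROTTLE + (PULSE_GAIN if pulsing else 0.0)

# Sample a few instants during one ~22 ms revolution while steering toward phase 0.
for t in [0.0, 0.005, 0.011, 0.017]:
    print(round(t, 3), round(throttle(t, steer_phase=0.0), 2))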

[ Modlab ]

If you need an arm for research or education, especially if there’s an HRI component involved, Sawyer is probably worth a serious look:

[ Rethink Robotics ]

Drone delivery: I hate it, except in situations exactly like this, where Zipline is delivering critical medical supplies to a rural hospital in less than 15 minutes. Making this delivery by car would take 3 hours.

This time lapse was captured today by Zip. Zip is our small robot airplane designed for a high level of safety, using many of the same approaches as commercial airliners. It can carry vaccines, medicine, or blood. Our fleet of Zips provides for a population of millions. No roads, no problem! In this video, you can see why Rwanda is commonly called the Land of a Thousand Hills. Zip transits safely above the hills, then descends for delivery in a spiraling helix. After delivery, Zip passes under a thunderhead and powers through a passing rainstorm.

[ Zipline ]

DESCARO (DEformable Shape CAsting RObot) is the “world’s first shadow art performing robot,” if you don’t count all the robots that have just done their own thing under dramatic direct lighting:

I kind of want a garden eel robot now.

[ AIS Lab ]

Aurora Flight Sciences is breaking ground in the world of automated flight through its work on the Aircrew Labor In-Cockpit Automation System (ALIAS) program. On October 17 Aurora demonstrated automated flight capabilities with ALIAS flying a Cessna Caravan through basic maneuvers under the supervision of a pilot.

Developed under contract through the Defense Advanced Research Projects Agency (DARPA), ALIAS utilizes a robotic system that functions as a second pilot in a two-crew aircraft, enabling reduced crew operations while ensuring that aircraft performance and mission success are maintained or improved. In the first phase of the program, Aurora succeeded in developing a non-invasive, extensible automated system that was tested on both a simulator and in flight on a Diamond DA-42 aircraft. Under Phase II, Aurora demonstrated the adaptability of ALIAS by installing it into the Cessna Caravan. Having successfully flight tested ALIAS on two separate platforms, work on installing the integrated ALIAS system onto a third air vehicle – a Bell UH-1 helicopter – is currently underway.

[ Aurora ] via [ Engadget ]

SparkFun's Autonomous Vehicle Competition got a little more challenging this year, but that didn't prevent some creative designs from trying to blast their way through it:

It’s definitely worth heading over to SparkFun’s site to check out videos of the other events as well.

[ AVC ]

At IROS this year, there was an autonomous drone racing contest. The winner was KAIST, who almost (almost!) made it to the end of the course:

[ KAIST ] via [ IROS 2016 ]

The latest von Kármán lecture at JPL comes from Aaron Parness, on “Asteroid Anchors, Rock Climbing Robots, Gecko Grippers, and Other Ways to Stick in Space.” It’s just fascinating stuff, a lot of which we’re actively covering on the blog.

The ability to rove the surface of Mars has revolutionized NASA missions. With more advanced mobility, cliff faces, cave ceilings, and the surfaces of asteroids and comets could be explored. This talk will present the work of the Robotic Rapid Prototyping Lab at NASA’s Jet Propulsion Laboratory. This includes grippers for NASA’s Asteroid Redirect Mission, which plans to extract a 15-ton boulder from the surface of an asteroid, and alter the asteroid’s orbit, a method that could prevent future impacts to the Earth. The talk will also present gecko-inspired adhesives currently being tested on the International Space Station, miniaturized robots that can drive across surfaces in zero gravity, and rock climbing robots traversing giant lava tubes in New Mexico. We will discuss not only the projects, but the new tools and techniques (3-D printers, computer-aided-design software, miniature electronics) that allow us to build and iterate robots more quickly than ever before.

[ JPL ]
