Video Friday: iCub Does Yoga, Wooden Walking Robot, and Wind Tunnel for Drones

Image: RobotCub Project via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your non-flexible Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

International Collaborative Robots Workshop – May 3-4, 2016 – Boston, Mass., USA
ICARSC 2016 – May 4-6, 2016 – Bragança, Portugal
Robotica 2016 – May 4-8, 2016 – Bragança, Portugal
ARMS 2016 – May 9-13, 2016 – Singapore
National Manufacturing Week – May 11-13, 2016 – Sydney, Australia
ICRA 2016 – May 16-21, 2016 – Stockholm, Sweden
NASA Robotic Mining Competition – May 18-20, 2016 – NASA KSC, Fla., USA
Skolkovo Robotics Conference – May 20, 2016 – Skolkovo, Russia
Innorobo 2016 – May 24-26, 2016 – Paris, France
RoboCity16 – May 26-27, 2016 – Madrid, Spain
RoboBusiness Europe – June 1-3, 2016 – Odense, Denmark
Dynamic Walking 2016 – June 4-7, 2016 – Holland, Mich., USA
IEEE RAS MRSSS 2016 – June 6-10, 2016 – Singapore
CR-HRI – June 6-10, 2016 – Orlando, Fla., USA
NASA SRRC Level 1 – June 6-11, 2016 – Worcester, Mass., USA
Field Robot Event – June 14-18, 2016 – Haßfurt, Germany
RSS 2016 – June 18-22, 2016 – Ann Arbor, Mich., USA
European Land Robot Trial – June 20-24, 2016 – Eggendorf, Austria
Automatica 2016 – June 21-25, 2016 – Munich, Germany
ISR 2016 – June 21-22, 2016 – Munich, Germany
UK Robotics Week – June 25 – July 1, 2016 – United Kingdom
Hamlyn Symposium on Medical Robotics – June 25-28, 2016 – London, England


Let us know if you have suggestions for next week, and enjoy today’s videos.


Friend o’ the blog Markus Waibel sent us this video of the craziest ETH Zurich Flying Machine Arena project (so far): the monospinner, which has one prop and nothing else. We’re told that Markus bet that this thing could only work in theory, and lost:

This video introduces the monospinner, the mechanically simplest controllable flying machine in existence. It has only one moving part (the rotating propeller). The vehicle features no additional actuators or aerodynamic surfaces.

The monospinner cannot hover like a standard multicopter. However, an unconventional equilibrium is found by analyzing the vehicle’s dynamics. For a certain constant angular speed and propeller force, the monospinner is able to remain substantially in one position. Feedback control keeps the vehicle near this equilibrium.

The mechanical design is chosen based on two robustness metrics: the ability to maintain hover under perturbations and the probability of input saturation based on a stochastic model. The resulting vehicle is sufficiently robust to achieve hover after being launched like a Frisbee.

Developed by Weixuan Zhang, Mark W. Mueller, and Raffaello D’Andrea.
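The equilibrium-plus-feedback idea in the description above can be sketched in a few lines: find a state where the dynamics roughly balance, then apply linear state feedback to keep deviations from that state small. This is a toy two-state linear system with made-up numbers, not the monospinner’s actual model or gains (those are in the FMA group’s paper):

```python
import numpy as np

# Toy sketch of stabilizing a system near an equilibrium with linear
# state feedback. The dynamics matrix, input matrix, and gain K below
# are all hypothetical -- NOT the monospinner's real model.

A = np.array([[0.0, 1.0],
              [2.0, 0.0]])    # unstable open-loop dynamics (illustrative)
B = np.array([[0.0],
              [1.0]])
K = np.array([[12.0, 7.0]])   # gain chosen so that A - B K is stable

x = np.array([[1.0], [0.0]])  # initial deviation from the equilibrium
dt = 0.01
for _ in range(2000):         # simulate 20 seconds with Euler steps
    u = -K @ x                # control input pushes back toward equilibrium
    x = x + dt * (A @ x + B @ u)

print(np.linalg.norm(x))      # deviation has decayed toward zero
```

Without the feedback term (set K to zeros), the same simulation diverges, which is the point: the hover isn’t passively stable, the controller makes it so.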

This reminds us of another spinning drone design, Lockheed Martin’s Samarai:

And if that’s not crazy enough, here’s a massive multicopter powered by multi-stage rockets, because why not.

[ FMA ]

Thanks Markus!


Want to deep-learningify your robot? Plug in this USB stick and get to work:

The new Fathom Neural Compute Stick is the world’s first embedded neural network accelerator. With the company’s ultra-low power, high performance Myriad 2 processor inside, the Fathom Neural Compute Stick can run fully-trained neural networks at under 1 Watt of power.

Thanks to standard USB connectivity, the Fathom Neural Compute Stick can be connected to a range of devices and enhance their neural compute capabilities by orders of magnitude. Neural Networks are used in many revolutionary applications such as object recognition, natural speech understanding, and autonomous navigation for cars. Rather than engineers programming explicit rules for machines to follow, vast amounts of data are processed offline in self-teaching systems that generate their own rulesets. Neural networks significantly outperform traditional approaches in tasks such as language comprehension, image recognition and pattern detection.

When connected to a PC, the Fathom Neural Compute Stick behaves as a neural network profiling and evaluation tool, meaning companies will be able to prototype faster and more efficiently, reducing time to market for products requiring cutting edge artificial intelligence.

“As a participant in the deep learning ecosystem, I have been hoping for a long time that something like Fathom would become available,” said Founding Director of New York University Data Science Center, Dr. Yann LeCun. “The Fathom Neural Compute Stick is a compact, low-power convolutional net accelerator for embedded applications that is quite unique. As a tinkerer and builder of various robots and flying contraptions, I’ve been dreaming of getting my hands on something like the Fathom Neural Compute Stick for a long time. With Fathom, every robot, big and small, can now have state-of-the-art vision capabilities.”

Fathom allows developers to take their trained neural networks out of the PC-training phase and automatically deploy a low-power optimized version to devices containing a Myriad 2 processor. Fathom supports the major deep learning frameworks in use today, including Caffe and TensorFlow.
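Movidius hasn’t published its toolchain details in this announcement, but one standard step in squeezing a trained network onto a low-power device is quantizing float32 weights down to 8-bit integers. Here’s a minimal, generic sketch of that idea (illustrative only; this is not the Fathom API):

```python
import numpy as np

# Generic weight quantization sketch: map float32 weights to int8 with a
# single per-tensor scale factor. This is a common embedded-deployment
# technique, not Movidius's actual (unpublished) pipeline.

def quantize_int8(w):
    scale = np.abs(w).max() / 127.0          # largest weight maps to +/-127
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(64, 32).astype(np.float32)  # stand-in weight matrix
q, s = quantize_int8(w)
err = np.abs(dequantize(q, s) - w).max()
print(err)  # worst-case rounding error is at most half the scale step
```

The payoff is 4x smaller weight storage and cheaper integer arithmetic, at the cost of a small, bounded rounding error per weight.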

[ Movidius ]


If you like Theo Jansen’s Strandbeests, you might want to have a look at this Kickstarter for an awesome-looking walking robot made of wood:

It’s a beautifully intricate design, and since it’s made of wood, it’s also very affordable (and very recyclable). The basic kit is just $200, and for $500, you also get gripper fangs, a tail, radios, some weaponized silly string, and an Android tablet (!).

As always, remember that you’re not buying one of these: you’re pledging money to help it get created. Having said that, these same guys have already delivered on an earlier Kickstarter for a non-robotic version of ZeGoBeast, so this isn’t their first wooden walking creature rodeo. And now that I’ve just said that, a wooden walking creature rodeo sounds like a lot of fun.

[ Kickstarter ]

Thanks Alex!


Here’s one of UPenn’s quadrotors doing real-time, on-board obstacle avoidance at a modest pace:

The hope, we’re told, is to crank it up to obstacle avoidance that’s a little more like this:

[ UPenn ]

Thanks Sikang!


Here’s some very cool work by Andreas Hermann at FZI on dynamic collision avoidance for collaborative robots. It runs like the blazes on a CUDA GPU, and it stops only when it predicts an impending collision: if it thinks it won’t actually run into you, it’ll keep on going, even while you poke at it:

The software is all open source, and you can check it out for yourself on GitHub.
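The core idea behind that predictive stopping can be sketched without the GPU: rasterize the robot’s predicted motion and the observed obstacles into the same voxel grid, and halt only if the two occupancy sets actually intersect. This toy NumPy version is just the concept; the real GPU-Voxels library does this at scale on CUDA, with an API of its own:

```python
import numpy as np

# Toy voxel-overlap collision check (concept only, not the GPU-Voxels API).
# Occupancy grids for the robot's predicted sweep and for obstacles share
# one grid; a predicted collision is simply a non-empty intersection.

GRID = (32, 32, 32)

def voxelize(points, grid=GRID):
    """Mark the voxels containing each (x, y, z) point as occupied."""
    occ = np.zeros(grid, dtype=bool)
    idx = np.clip(points.astype(int), 0, np.array(grid) - 1)
    occ[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return occ

# Hypothetical data: robot sweeps along z from 5 to 14; obstacle sits at z=20.
robot_sweep = voxelize(np.array([[5, 5, z] for z in range(5, 15)]))
obstacle    = voxelize(np.array([[5, 5, 20], [6, 5, 21]]))

collision_predicted = bool((robot_sweep & obstacle).any())
print(collision_predicted)  # False: obstacle is outside the sweep, keep moving
```

Move the obstacle into the swept volume and the intersection becomes non-empty, which is when a system like this would actually stop the arm.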

[ GPU-Voxels ]

Thanks Arne!


Which is better at dealing with a 22-degree slope: a bipedal robot with no sensors, or a gantry handled by a trio of grad students? The answer will not surprise you:

Thank you for not including MARLO’s violent e-stop death at the end there.

[ MARLO ]


If Cornell puts half the effort into its robot submarines that it put into this trailer about them, it’s going to win this year’s RoboSub competition handily. I mean, flipperily.

"HAMMERTIME!" (That’s a thing Thor says, right?)

[ Cornell AUV ]


And now, the weirdest useful robotics application of the week: real-time live horse imaging:

Whoever designed this system was probably irked that horses aren’t spherical. I know I am.

[ UPenn ]


After yet more hype (most of which we chose not to waste your time with), here’s the latest video from KUKA featuring Timo Boll. Brace yourself:

Sigh. You’re better than this, KUKA. You make awesome robots that do awesome things, so why are you doing all of this dumb CGI stuff? Robots have been balancing pendulums for years; make a big one and get your Titan to ACTUALLY DO THIS. Maybe not with the expensive human on top, but still.

[ KUKA ]


Last year, KAIST’s Unmanned Systems Research Group participated in an autonomous car demo in downtown Seoul. It was raining, which made things harder, but they also completely closed off a major street, which made things easier:

[ KAIST USRG ]


iCub is somehow far, far more flexible than I am:

If only someone could cleverly program me to be less lazy and take a yoga class or something.

[ Paper ]


I think it is very very very very very important that IEEE Spectrum provides you, dear reader, with in-person coverage of this year’s RobotX competition, which incidentally is taking place in Hawaii in December:

It’ll be a sacrifice, but robots are worth it, and so are you.

[ RobotX ]


Videos like these make me realize just how far ahead of us birds are when it comes to flight control systems.

[ Stanford ]


What will happen if we teach robots to think? I have no idea, but here’s what some smart people from UW–Madison have to say about it:

[ Wisconsin HCI Lab ]


Michigan Robotics Day was serious business this year, and featured lectures from the likes of Wolfram Burgard and David Akin:

[ Michigan Robotics ]

Thanks Chris!


Professor Sethu Vijayakumar, Personal Chair in Robotics at the University of Edinburgh, delivers the 2015 Tam Dalyell Prize lecture, entitled “Sharing Autonomy (and responsibility): The Robots Are Ready, Are You?”

In this lecture, Professor Vijayakumar will look at how humans and robots will work together in the future. The next generation of robots will work much more closely with humans, other robots and interact significantly with the environment around them. With significant autonomy devolved to the robotic platforms, will we be able to share control in a way we are comfortable with?

[ University of Edinburgh ]


This week’s CMU RI Seminar is from Raj Madhavan, Founder and CEO, Humanitarian Robotics Technologies.

Robotics and automation technologies hold immense promise in transforming people’s lives across various communities around the globe. However, there exists a huge disconnect between what is possible from an engineering and scientific viewpoint and what the expectations of the general public are. The problem lies in the fact that we have not seen many practical solutions that can be deployed in a truly useful and effective fashion towards making a difference in the quality of lives of people.

In this talk, I will describe my current work focusing on the applied use of robotics and automation technologies for the benefit of under-served and under-developed communities by working closely with them to sustain developed solutions. This is made possible by bringing together researchers, practitioners from industry, academia, local governments, and various entities such as the IEEE Robotics Automation Society’s Special Interest Group on Humanitarian Technology (RAS-SIGHT), NGOs, and NPOs across the globe.

I will discuss a demining challenge that I have co-organized for the last two years with the intent of producing an open-source solution for detecting and classifying unexploded ordnance buried in minefields. I will also outline my recent efforts in the technology and public policy domains with emphasis on socio-economic, cultural, privacy, and security issues in developing and developed economies.

[ CMU Robotics Institute ]


Automaton

IEEE Spectrum’s award-winning robotics blog, featuring news, articles, and videos on robots, humanoids, drones, automation, artificial intelligence, and more.
Contact us:  e.guizzo@ieee.org

Editor
Erico Guizzo
New York City
Senior Writer
Evan Ackerman
Washington, D.C.