
Video Friday: RoboCup, Drone Magic, and NotBot Is Pedro

Your weekly selection of awesome robot videos

6 min read

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Humanoid robots play soccer at RoboCup
Image: RoboCup 2017 via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

RoboBusiness Europe – April 20-21, 2017 – Delft, Netherlands
RoboGames 2017 – April 21-23, 2017 – Pleasanton, Calif., USA
ICARSC – April 26-30, 2017 – Coimbra, Portugal
AUVSI Xponential – May 8-11, 2017 – Dallas, Texas, USA
AAMAS 2017 – May 8-12, 2017 – Sao Paulo, Brazil
Austech – May 9-12, 2017 – Melbourne, Australia
Innorobo – May 16-18, 2017 – Paris, France
NASA Robotic Mining Competition – May 22-26, 2017 – NASA KSC, Fla., USA
IEEE ICRA – May 29-June 3, 2017 – Singapore
University Rover Challenge – June 1-3, 2017 – Hanksville, Utah, USA
IEEE World Haptics – June 6-9, 2017 – Munich, Germany
NASA SRC Virtual Competition – June 12-16, 2017 – Online
ICCV 2017 – June 13-16, 2017 – Venice, Italy

Let us know if you have suggestions for next week, and enjoy today’s videos.

After seeing this video, I’m convinced that RoboCup 2017 in Nagoya will be the best robot competition in the history of the universe.

[ RoboCup 2017 ]

NASA’s humanoid robots are too big and expensive to spend their time doing human-robot interaction studies, so students at Rice University built NotBot to take their place:

It’s not at all surprising that a human in a robot suit is much more capable than an actual robot for many (if not most) applications.

[ Rice ]

Your Easter didn’t have enough laser robots, because if it had enough laser robots, it would have looked like this:

Special effects courtesy of photochromic paint on the egg and a strobe light.

[ Eggstatic 2 ]

Thanks Jiri!

No no no no no no no no no no

Professor Aaron Becker tells the story:

After a guest lecture by Stephen Kleinen of Shepherd Controls for "Intro to Robotics" @ UH, we decided to test just how safe this robot is. My dad is a dentist, so I just happened to have a toothbrush.

[ Aaron Becker ]

A slightly less terrifying video from Aaron Becker’s lab (specifically, by his student Arun Mahadev) showing how global input can be used with boundaries to create arbitrary shapes if you’re patient and clever enough:
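
If you're curious how a single global input plus fixed boundaries can rearrange particles, here's a minimal toy simulation of the idea (my own sketch, not the lab's actual code): every particle receives the identical move command, but walls stop some of them, so a sequence of "tilts" changes their relative positions and, eventually, the shape of the swarm.

```python
# Toy model: all particles get the same command; walls and collisions do the shaping.
GRID = [
    "########",
    "#...#..#",
    "#......#",
    "########",
]

def is_free(x, y):
    """True if (x, y) is inside the grid and not a wall."""
    return 0 <= y < len(GRID) and 0 <= x < len(GRID[0]) and GRID[y][x] == "."

def tilt(particles, dx, dy):
    """Apply the same (dx, dy) command to every particle; each one slides
    until it hits a wall or another particle (a 'tilt' of the whole board)."""
    particles = set(particles)
    moved = True
    while moved:
        moved = False
        # Process particles furthest along the motion direction first so
        # trailing particles can follow into freed cells.
        for p in sorted(particles, key=lambda q: -(q[0] * dx + q[1] * dy)):
            nxt = (p[0] + dx, p[1] + dy)
            if is_free(*nxt) and nxt not in particles:
                particles.remove(p)
                particles.add(nxt)
                moved = True
    return particles

# Two particles start stacked in the same column; the interior wall in the top
# row stops one of them, so after a tilt right and a tilt down they end up three
# cells apart, and the swarm's shape has changed using only global commands.
swarm = {(1, 1), (1, 2)}
for command in [(1, 0), (0, 1)]:
    swarm = tilt(swarm, *command)
    print(command, sorted(swarm))
```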

[ University of Houston ]

Rather than manually pushing carts full of parts and equipment around its repair facility in Wisconsin, GE has enlisted the help of OTTO Motors for autonomous on-demand deliveries:

[ OTTO Motors ]

A slow motion camera and a robot arm teamed up to make this gorgeous advertisement for some brand of paint that I don’t remember:

And here’s the commercial itself.

[ McKinney ] via [ PetaPixel ]

As Davide Scaramuzza points out, what’s remarkable about Marco Tempest’s mini drone swarm performance is that the drones are all super cheap ($120) Parrot mini drones, localizing solely based on the unique colored tiles on the floor.
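
For a sense of how tile-based localization can work, here's a deliberately simplified sketch (my own illustration with made-up tile colors and positions, not Tempest's actual software): if every tile color appears exactly once, the dominant color seen by a downward-facing camera identifies which tile the drone is over, and therefore roughly where it is.

```python
# Hypothetical tile map: reference color (R, G, B) -> tile center position in meters.
TILE_MAP = {
    (220, 40, 40):  (0.0, 0.0),
    (40, 220, 40):  (0.5, 0.0),
    (40, 40, 220):  (0.0, 0.5),
    (220, 220, 40): (0.5, 0.5),
}

def closest_tile(observed_rgb):
    """Return the map position of the tile whose reference color is closest
    (in simple squared-RGB distance) to the observed dominant color."""
    def dist(color):
        return sum((a - b) ** 2 for a, b in zip(color, observed_rgb))
    best_color = min(TILE_MAP, key=dist)
    return TILE_MAP[best_color]

# The camera reports a slightly washed-out red, so we localize to the red tile.
print(closest_tile((200, 60, 55)))   # -> (0.0, 0.0)
```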

[ Marco Tempest ]

Mocoro is a robot from IBM Research Japan that’s designed to provide feedback to professors teaching courses online or via videoconference, when it’s difficult for them to see how students are reacting to their lectures. The idea is that if the professor is super boring, or talks too quickly, the robot will be there to fall asleep or look confused:

Without a way for the lecturer to monitor and respond to reactions, it is difficult to determine what information resonated with (or bored) the students. Oftentimes, students are in a lecture hall, staring at their instructor, who is projected on a widescreen. The lecturer also has little control over zooming in or out, or changing what he or she sees of the students. The same situation applies to students who are joining the class from a remote location.

To solve this digital distance challenge, IBM researcher Akihiro Kosugi came up with the idea of a cognitive user interface, to act as an observation window, built into the computers connecting students and lecturer. The interface allows the lecturer to move a camera around the classroom to better interact with the students. Then, Akihiro and IBM Research intern Shogo Nishiguchi took this idea of an autonomous virtual assistant with fluid interaction between student and lecturer another step forward, and built a telepresence agent that could attend courses with students and lecturer.

The bot, called Mocoro, is designed to express, through simple facial expressions, confusion, boredom, or other body language. To help the lecture flow smoothly, Mocoro does not interrupt. Instead, the lecturer must notice Mocoro’s expressions, such as turning pale, or looking down, and ask it: “Is anything wrong?” Then, Mocoro might respond with something like: “I am sorry, but, I could not understand what you just said. Would you repeat it for me?” This way, the interaction feels like a normal classroom discussion.
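
As a rough illustration of that non-interrupting interaction pattern (my own sketch, not IBM's implementation, and the spoken lines beyond the one quoted above are invented), the agent only updates its displayed expression as it observes the class, and it speaks only after the lecturer asks.

```python
# Toy feedback agent: express state silently, answer only when prompted.
class FeedbackAgent:
    def __init__(self):
        self.expression = "neutral"
        self.pending_message = None

    def observe(self, students_confused, pace_too_fast):
        """Update the displayed expression instead of interrupting the lecture."""
        if students_confused:
            self.expression = "confused"
            self.pending_message = ("I am sorry, but I could not understand what "
                                    "you just said. Would you repeat it for me?")
        elif pace_too_fast:
            self.expression = "looking down"
            self.pending_message = "Could you slow down a little, please?"
        else:
            self.expression = "neutral"
            self.pending_message = None

    def on_lecturer_prompt(self):
        """Speak only when the lecturer notices the expression and asks."""
        reply = self.pending_message or "Everything is fine, thank you."
        self.pending_message = None
        return reply

agent = FeedbackAgent()
agent.observe(students_confused=True, pace_too_fast=False)
print(agent.expression)            # -> confused
print(agent.on_lecturer_prompt())  # -> polite request to repeat
```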

[ IBM Research ]

A demo of a robotic arm, ADA, using shared autonomy to assist with eating in the Personal Robotics Lab at the Robotics Institute, Carnegie Mellon University.

I wonder exactly how many marshmallows those poor undergrads had to choke down to get this system to work.
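
For readers unfamiliar with the term, shared autonomy generally means blending the user's input with the robot's own plan. Here's a generic linear-blending sketch of that idea; this is an assumption on my part, and the Personal Robotics Lab's actual controller is certainly more sophisticated.

```python
import numpy as np

def blend_command(user_velocity, robot_velocity, confidence):
    """Blend the user's commanded end-effector velocity with the robot's
    autonomous suggestion. 'confidence' in [0, 1] reflects how sure the robot
    is about the user's intended goal (e.g., which bite of food they want);
    the more confident it is, the more control it takes."""
    user_velocity = np.asarray(user_velocity, dtype=float)
    robot_velocity = np.asarray(robot_velocity, dtype=float)
    alpha = float(np.clip(confidence, 0.0, 1.0))
    return alpha * robot_velocity + (1.0 - alpha) * user_velocity

# The user pushes the joystick roughly toward the plate; the robot, fairly sure
# which marshmallow is the target, nudges the motion onto a cleaner path.
print(blend_command([0.10, 0.00, -0.02], [0.08, 0.03, -0.05], confidence=0.7))
```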

[ CMU ]

The best robots are the robots that have rockets on them. NASA hopes that one day, one of these might land itself on a distant planet or moon.

Over the past five weeks, NASA and Masten teams have prepared for and conducted sub-orbital rocket flight tests of next-generation lander navigation technology through the CoOperative Blending of Autonomous Landing Technologies (COBALT) project.

The COBALT payload was integrated onto Masten's rocket, Xodiac. The Xodiac vehicle used the Global Positioning System (GPS) for navigation during this first campaign, an intentional choice to verify and refine COBALT system performance. The joint teams conducted numerous ground verification tests, made modifications in the process, practiced and refined operations procedures, conducted three tether tests, and have now flown two successful free flights. This successful, collaborative campaign has provided the COBALT and Xodiac teams with the valuable performance data needed to refine the systems and prepare them for the second flight test campaign this summer, when the COBALT system will navigate the Xodiac rocket to a precision landing.

The technologies within COBALT provide a spacecraft with knowledge during entry, descent and landing that enables it to precisely navigate and softly land close to surface locations that have been previously too risky to target with current capabilities. The technologies will enable future exploration destinations on Mars, the moon, Europa, and other planets and moons.

[ NASA ]

The National Science Foundation and National Robotics Initiative are responsible for supporting an enormous amount of important robotics research, and these two videos highlight just a little bit of what’s been going on:

[ NSF NRI ]

You may have seen Kuka’s robots flipping bottles at SXSW this year, but here’s a bit of background on the machine learning that was used to make it work.

Kuka had a big presence at SXSW, and here’s a bit more about what they were up to.

[ Kuka ]

Besides making a solid power and data connection to your robot, a tether can also be useful for helping it explore areas that it might not be able to reach otherwise.

[ ASRL ]

The ICRA 2017 DJI RoboMasters Mobile Manipulation Challenge asks competitors “to develop a lightweight mobile manipulator that can autonomously pick, transport and stack building blocks.” Team Homer looks ready to me:

This is the qualification video of team homer of the University of Koblenz-Landau. This time we are aiming for the DJI RoboMasters ICRA Mobile Manipulation Challenge in Singapore with our new robot Grimey, named after Frank "Grimey" Grimes. The robot is equipped with a mobile mecanum platform and custom control code. A Kinova Mico robotic arm with custom grippers is used. For localization and mapping we use a Hokuyo laser scanner. Furthermore, the robot is equipped with two ASUS Xtion Pro Live RGB-D cameras that have not yet been used in the qualification video.
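
Since the platform is mecanum-based, here's the textbook mecanum inverse kinematics as a quick sketch; the wheel radius and chassis dimensions below are made up, and this is not Team Homer's actual control code.

```python
def mecanum_wheel_speeds(vx, vy, wz, wheel_radius=0.05, half_length=0.20, half_width=0.15):
    """Convert a desired body velocity (vx forward, vy left, wz yaw rate, SI units)
    into the four wheel angular velocities of a mecanum platform."""
    k = half_length + half_width
    front_left  = (vx - vy - k * wz) / wheel_radius
    front_right = (vx + vy + k * wz) / wheel_radius
    rear_left   = (vx + vy - k * wz) / wheel_radius
    rear_right  = (vx - vy + k * wz) / wheel_radius
    return front_left, front_right, rear_left, rear_right

# Pure sideways motion (vy only): the wheels on each side spin in opposite
# directions, which is what lets the platform strafe toward a block.
print(mecanum_wheel_speeds(vx=0.0, vy=0.2, wz=0.0))
```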

[ Team Homer ]

Chris Urmson, the CEO and cofounder of Aurora (and previously the head of Google’s self-driving car program) gave this year’s Yata Memorial Lecture in Robotics at CMU. Somewhat disappointingly, the lecture does not reveal all about Aurora.

[ Chris Urmson ]

This is a very long video (nearly an hour and a half), but it goes quickly, since it's made up of very short, back-to-back presentations from the last 17 robotics projects funded under the EU's Horizon 2020 SPARC partnership:

[ Horizon 2020 ]
