Next Friday is Halloween! This means that you have just seven days to decide what robot you want to dress up as, along with what robots you want your robots to dress up as when they go trick-or-treating. My TurtleBot is going as a Dalek, my Pleo is going as a Paro, my AR Drone is going as an illegal autonomous quadrotor, my Roomba is going as a Neato, and my Sphero is going as a cube. And as for me, well, I’m going as a Geminoid: costumes don’t get much easier than that, right?
Let’s see what we can do to inspire you with our regular Friday full of robot videos.
Last week, we wrote about how Audi’s robotic racecar was going to take a full-speed lap on a track in Germany. It managed to do so without crashing (not that we’re surprised), and Audi has highlights from the lap:
And a longer version, with more technical detail:
[ Audi ]
The University of Freiburg has provided an excellent reason for everybody to get a Nao: it can autonomously clean up a room, hooray!
The robot observes the scene at the start and then plans symbolic actions in order to pick up all objects and transport them to their container. Throughout, the robot continuously observes the environment and monitors its state, reacting to unforeseen changes and replanning as needed.
[ Freiburg ]
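That observe/plan/monitor loop is easy to sketch in miniature. Everything below (the world model, planner, and action names) is a hypothetical stand-in, not the Freiburg team's actual software:

```python
# Toy version of the observe -> plan -> execute -> monitor -> replan cycle.

def plan_actions(objects_on_floor):
    """Symbolic plan: pick up each remaining object, then drop it in the bin."""
    plan = []
    for obj in sorted(objects_on_floor):
        plan.append(("pick", obj))
        plan.append(("drop", obj))
    return plan

def tidy_room(world):
    """Execute the plan, re-observing the world and replanning on surprises."""
    plan = plan_actions(world["floor"])
    executed = []
    while plan:
        verb, obj = plan.pop(0)
        # Monitoring step: if the world changed (e.g. someone moved the
        # object), the action's precondition fails and we replan from scratch.
        if verb == "pick" and obj not in world["floor"]:
            plan = plan_actions(world["floor"])
            continue
        if verb == "pick":
            world["floor"].remove(obj)
        else:
            world["container"].add(obj)
        executed.append((verb, obj))
    return executed

world = {"floor": {"ball", "cup"}, "container": set()}
tidy_room(world)
# Afterwards the floor is empty and both objects are in the container.
```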
The best kind of art is art that demands to be seen, by chasing you down and making you notice it, if that’s what it takes:
And here’s how it was made. Can you spot the robot?
In order for Northrop Grumman’s MQ-8C Fire Scout to land on a rolling ship, it’s first going to have to master landing on a static slope, which it seems pretty good at so far:
[ Fire Scout ]
At the DRC Trials, we got used to a very, very slow pace. But things are speeding up in preparation for the Finals next year:
Using localization, we are able to send a single footstep plan to the robot telling it to walk over the cinder blocks and repeat, without adjusting any footsteps. In this video we show Atlas walking over and back 10 times without any operator involvement.
[ IHMC Robotics ]
It’s not really practical to field teams of 11 Nao robots for RoboCup competitions, which is one of the reasons why RoboCup has a simulation league. UT Austin Villa were the undisputed champions in 2014, with 52 goals scored and zero goals conceded. Here are match highlights:
A key component of the team's success was the use of kick anticipation (having players broadcast to their teammates where they are kicking the ball, so that their teammates can run toward that location), combined with long-distance kicking for passing and shooting.
[ UT Austin ]
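The kick-anticipation idea is simple enough to sketch: the kicker broadcasts its intended target, and a teammate starts running toward that point before the ball even lands. The names and the simple 2D geometry below are illustrative only, not UT Austin Villa's actual code:

```python
import math

def step_toward(pos, target, speed):
    """Move `pos` toward `target` by at most `speed` units."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

class Teammate:
    def __init__(self, pos):
        self.pos = pos
        self.goal = pos          # where we currently intend to run

    def on_kick_announced(self, target):
        self.goal = target       # anticipate: head for the landing spot now

    def tick(self, speed=1.0):
        self.pos = step_toward(self.pos, self.goal, speed)

# The kicker announces a long kick toward (10, 5); the receiver starts
# running immediately instead of waiting to see where the ball goes.
receiver = Teammate(pos=(0.0, 0.0))
receiver.on_kick_announced((10.0, 5.0))
for _ in range(20):
    receiver.tick()
# receiver.pos is now (10.0, 5.0)
```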
Kilobots have been gradually developing a bunch of discrete swarm behaviors, and by combining these behaviors, the robots are able to coordinate with each other to form patterns of lights, and even display pictures and letters:
A composition showing various collective behaviors programmed on a Kilobot swarm. Phototaxis: each robot uses a single sensor to move up the gradient of light. Gradient formation: information propagates outward from a single source robot, and the other robots color themselves red, blue, or green according to their distance from the source, measured in information hops. Synchronization: each robot has an oscillator and adjusts its phase to match its local neighbors until the whole system converges. Dynamic pattern formation: the robots use distributed triangulation and a few seed robots to form a global coordinate system for pattern formation. This is combined with synchronization to allow the group to flash the words "Hello World".
[ Kilobots ]
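The gradient-formation behavior in that caption is essentially hop-count message passing, which a toy version makes concrete: a count spreads outward from a seed robot, and each robot colors itself by its hop distance. The adjacency dict below is a stand-in for the Kilobots' real local infrared messaging:

```python
from collections import deque

def hop_gradient(neighbors, seed):
    """Breadth-first hop counts from `seed` over an adjacency dict."""
    hops = {seed: 0}
    queue = deque([seed])
    while queue:
        robot = queue.popleft()
        for n in neighbors[robot]:
            if n not in hops:              # first message to arrive wins
                hops[n] = hops[robot] + 1
                queue.append(n)
    return hops

COLORS = ["red", "blue", "green"]

# Five robots in a line; robot 0 is the gradient source.
line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
hops = hop_gradient(line, seed=0)
colors = {r: COLORS[h % 3] for r, h in hops.items()}
# hops -> {0: 0, 1: 1, 2: 2, 3: 3, 4: 4}
```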
Over time, robots (especially complex ones) get out of calibration. Calibrating them over and over is a pain, so ideally, they'd be able to calibrate themselves. They can do this by (say) holding up cards with patterns on them to check joint angles, or some of them can do it through touch:
This video addresses the problem of self-touch (or double touch): the robot touching itself on a specific region of its skin. This behavior gives a humanoid robot an unprecedented opportunity to simultaneously activate multiple skin parts (in this video, patches belonging to the right hand and the left forearm). That wealth of information can then be used for completely autonomous calibration of the body model (i.e., kinematic calibration or tactile calibration). In the reference paper cited below, this capability was used to perform a kinematic calibration of both the right and the left arms.
“Automatic Kinematic Chain Calibration Using Artificial Skin: Self-Touch in the iCub Humanoid Robot,” by A. Roncone, M. Hoffmann, U. Pattacini, and G. Metta, from the Italian Institute of Technology, was presented earlier this year at ICRA 2014 in Hong Kong.
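The underlying idea is easy to illustrate: when the robot touches itself, the skin reports where contact actually happened, and any mismatch with where the kinematic model predicts the fingertip to be exposes model error. Here's a deliberately tiny planar sketch of that principle (a 2-link arm and a brute-force search, nothing like the paper's actual iCub calibration):

```python
import math

def fingertip(joints, offset=0.0, link=1.0):
    """Forward kinematics of a planar 2-link arm with a joint-angle offset."""
    a1, a2 = joints[0] + offset, joints[1] + offset
    x = link * math.cos(a1) + link * math.cos(a1 + a2)
    y = link * math.sin(a1) + link * math.sin(a1 + a2)
    return (x, y)

# Pretend the real arm has a 0.1 rad encoder offset we don't know about.
TRUE_OFFSET = 0.1
poses = [(0.3, 0.5), (1.0, -0.4), (0.2, 1.1)]          # joint angles at self-touch
measured = [fingertip(p, TRUE_OFFSET) for p in poses]  # skin reports true contacts

def residual(offset):
    """Total squared mismatch between the model and the measured contacts."""
    return sum(
        (fingertip(p, offset)[0] - m[0]) ** 2
        + (fingertip(p, offset)[1] - m[1]) ** 2
        for p, m in zip(poses, measured)
    )

# Calibrate: find the offset that best explains the measured contacts.
best = min((o / 1000.0 for o in range(-300, 301)), key=residual)
# `best` recovers the unknown 0.1 rad offset
```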
Haven’t you always wanted to know all of the things that you should absolutely NOT do with a PR2? Here’s an old video from Willow Garage discussing PR2 safety. It's funny (intentionally funny), and is also worth watching to see if you can recognize all of the young-looking Willow Garagers. It’s been nearly half a decade, which seems like both a very long time, and not a very long time at all.
[ Willow Garage ]
We’ll close the week with a series of videos from AUVSI about the first-ever Maritime RobotX Challenge, which is taking place in Singapore through the weekend. I’d explain how it works, but AUVSI has put together all of this excellent footage, so I’ll just let them do it. The following vids take us through the intro and practice days, and we’ll bring you the competition itself next week.