Video Friday: Robot Sword Fights, MIT Basement Racing, and RoboGames

Terrible idea: teaching robots to use swords. Not terrible idea: spending the rest of your Friday watching robot videos.

I will never, ever, ever understand why robotics researchers seem to feel the need to persist in teaching their robots how to use swords, of all things. First Georgia Tech, then Stanford, and now Namiki Lab in Japan is doing it with robot arms that can move faster than you can. At some point, this is all going to go horribly wrong, but until that happens, we can enjoy the videos. And we have lots of videos this week, what with the perfect storm of National Robotics Week, RoboGames, and it being Friday. Here we go! 

We propose a sword-fighting robot system controlled by a stereo high-speed vision system as an example of a human-robot dynamic interaction system. The developed robot system recognizes both the position of the human player and that of the sword grasped by the robot hand. It detects the moment when the human starts to move using ChangeFinder, a method for detecting turning points. Next, it predicts the possible trajectories of the human player’s sword with a least-squares method from the moment the attack starts. Finally, it judges the type of attack and generates an appropriate defensive motion. Experimental results verify the effectiveness of the proposed algorithm.
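If you’re wondering what “predicts the possible trajectories of the sword by a least-squares method” actually looks like, the general idea is simple: fit a low-order polynomial to the last several observed sword-tip positions and extrapolate it forward in time. Here’s a toy Python sketch of that idea (the 500 Hz sample rate, the quadratic fit, and the made-up positions are my assumptions for illustration, not Namiki Lab’s actual code):

import numpy as np

# Last few observed sword-tip positions (meters), sampled at an assumed
# 500 Hz by the high-speed vision system -- values invented for illustration.
dt = 1.0 / 500.0
t = np.arange(6) * dt
x = np.array([0.50, 0.48, 0.45, 0.41, 0.36, 0.30])  # tip closing on the robot
z = np.array([1.20, 1.19, 1.17, 1.14, 1.10, 1.05])  # tip dropping

# Fit a quadratic to each coordinate by least squares (np.polyfit minimizes
# squared residuals), then extrapolate 20 ms into the future.
t_future = t[-1] + 0.020
x_pred = np.polyval(np.polyfit(t, x, 2), t_future)
z_pred = np.polyval(np.polyfit(t, z, 2), t_future)
print("predicted tip position in 20 ms: x=%.3f m, z=%.3f m" % (x_pred, z_pred))

The real system presumably fits many candidate trajectories and classifies the attack type before committing to a parry, but the prediction step boils down to something like this.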

In recent years, various robot hands and arms have been developed to achieve dexterous manipulation tasks. However, there are few robots that can not only move quickly but also handle tools dexterously. The motion in the Japanese game kendama is one example of dynamic manipulation and skillful handling. Although robotic kendama has been studied in the past, those robot hands could not be used effectively. The purpose of this study was to achieve the kendama motion by estimating the state of the object to be grasped based on a high-speed vision system and CoP tactile sensors. Our robot successfully performed the catching motion in kendama.
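In case “CoP tactile sensors” sounds mysterious: CoP is the center of pressure, the pressure-weighted average location of contact on a tactile pad, which tells the hand where the ball is actually pressing. Here’s a toy version of that calculation (the 4x4 sensor grid and the readings are invented for illustration):

import numpy as np

# Fake readings from a 4x4 tactile array (arbitrary pressure units);
# a real sensor would stream these continuously during the catch.
pressure = np.array([
    [0.0, 0.1, 0.0, 0.0],
    [0.2, 0.9, 0.4, 0.0],
    [0.1, 0.7, 0.3, 0.0],
    [0.0, 0.0, 0.0, 0.0],
])

# Pad coordinates (row, col), then the pressure-weighted mean position:
rows, cols = np.indices(pressure.shape)
total = pressure.sum()
cop_row = (rows * pressure).sum() / total
cop_col = (cols * pressure).sum() / total
print("center of pressure at pad (%.2f, %.2f)" % (cop_row, cop_col))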

Paper folding is one of the most difficult tasks for multi-fingered robot hands because paper is deformable and its stiffness distribution is nonuniform. In this study, we aim to achieve dexterous paper folding by extracting several dynamic primitives. Each primitive uses visual and force information, a physical model of the paper sheet for analyzing its deformation, and a machine learning method for predicting its future state. In this paper, we propose a strategy for achieving two valley folds of a sheet of paper in a row. In the second fold, the crease line from the first fold disturbs the accuracy of the folding, and we propose new manipulation techniques to solve this problem. Finally, we show demonstrations of paper folding achieved with a high success rate.

[ Namiki Laboratory ]

RoboGames. RoboGames? RoboGames! We live very, very far away from RoboGames, but Make doesn’t:

We do have some exclusive(ish) footage of one particular robot that we’ve always been big fans of: Counter Revolution, a heavyweight robot that made its first appearance in 2009. It was designed and built by a bunch of people who were working at Willow Garage at the time, including Curt Meyers, Melonee Wise, Derek King, Dallas Goecker, and Michael Gregg. Six years later, Counter Revolution is back, and just as deadly, except this time maybe slightly more deadly to other robots and slightly less deadly to itself:

[ RoboGames ]

Thanks Michael and Dallas!

National Robotics Week is STILL GOING ON, and it’s inspired some of the world’s greatest roboticists (who are all at Georgia Tech, obviously) to talk about the future of robotics. There are five vids in the playlist, so make sure you click through all of ‘em:

[ Georgia Tech ]

I don’t know what passes for “education” at MIT these days, because “learning stuff” now involves teaching autonomous R/C cars to race around basement hallways:

The robot chassis was based on a 1:10-scale radio-controlled racecar modified to accept onboard control of its steering and throttle actuators. To perceive its motion and the local environment, the robot was outfitted with a heterogeneous set of sensors, including a scanning laser range finder, camera, inertial measurement unit, and visual odometer. Sensor data and autonomy algorithms were processed on board with an NVIDIA Jetson Tegra K1 embedded computer. The Tegra K1 processor features a 192-core general-purpose graphics processing unit. The MIT IAP activity was one of the first to integrate the emerging embedded supercomputers into an educational event.

The robot’s software leverages the Robot Operating System (ROS) framework to facilitate rapid development. ROS is a collection of open-source drivers, algorithms, tools, and libraries widely used by robotics researchers and industry. Students integrated existing software modules, such as drivers for reading sensor data, alongside their custom algorithms to rapidly compose a complete autonomous system.
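For the uninitiated, “composing a complete autonomous system” in ROS mostly means writing small nodes that subscribe to sensor topics and publish command topics. Here’s a bare-bones sketch of what a (comically naive) racing node could look like; the topic names, message types, and steer-toward-open-space rule are my placeholders, not the actual MIT course code:

#!/usr/bin/env python
import numpy as np
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

def on_scan(scan, pub):
    # Naive strategy: drive forward, turning toward the farthest laser return.
    ranges = np.nan_to_num(np.array(scan.ranges))
    best = int(np.argmax(ranges))
    cmd = Twist()
    cmd.linear.x = 1.0  # constant forward speed (m/s)
    # Turn rate proportional (gain 1) to the bearing of the farthest return.
    cmd.angular.z = scan.angle_min + best * scan.angle_increment
    pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("naive_racer")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, on_scan, callback_args=pub)
    rospy.spin()

A car driven by this node would last about three seconds in a basement hallway, which is exactly why the students spent a whole course on better algorithms.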

[ RACECAR ] via [ MIT ]

Amazon’s Picking Challenge is going to be soundly trounced at ICRA 2015 if Team PickNik has anything to say about it:

[ Team PickNik ]

[ Amazon Picking Challenge ]

In the Netherlands, it’s traditional to roast the first asparagus of the season underneath the flaming carcass of the delivery drone that was supposed to drop it off at a fancy restaurant:

[ YouTube ]

Meanwhile, SenseFly is making agricultural drones that are somewhat less likely to crash and explode:

[ SenseFly eBee ]

YuMi, from ABB, is a tasty-sounding collaborative robot that ABB will officially release next week:

With the introduction of YuMi, the world’s first truly collaborative dual-arm industrial robot, ABB Robotics is once again pushing the boundaries of what robotic automation will look like in the future and how it will fundamentally alter the types of industrial processes that can be automated with robots.

A play on words, YuMi signifies “you” and “me” creating an automated future together. This groundbreaking solution is the result of years of research and development, heralding a new era of robotic coworkers that are able to work side by side with humans on the same tasks while still ensuring the safety of those around them.

Few production arenas are changing as quickly as small parts assembly. The electronics industry, in particular, has seen demand skyrocket past the supply of skilled labor. As conventional assembly methods diminish in value, manufacturers are finding it strategically and economically imperative to invest in new solutions.

While YuMi was specifically designed to meet the flexible and agile production needs required in the consumer electronics industry, it has equal application in any small parts assembly environment thanks to its dual arms, flexible hands, universal parts feeding system, camera-based part location, lead-through programming, and state-of-the-art precise motion control.

[ YuMi ]

NSF checks out the UCLA Biomechatronics Lab to see how they’re incorporating a sense of touch into prosthetics:

[ UCLA ]

DARwIn-OP checks out NSF to see how they’re nationally foundationing science:

[ NSF ]

With the DRC Finals coming up in June, a few of the braver teams have started posting videos of their progress. Looking good! Mostly!

[ MIT ]

[ KAIST ]

[ Grit Robotics ]

[ DRC Finals ]

Thanks Mike!

Tech United Eindhoven is preparing for a robotic soccer competition in Portugal. This (very short) vid is worth watching just for the last little bit at the end, where a few humans take on the robots. I love watching the robots adaptively swarm around the humans, even as they try to take one of the poor dudes out completely:

[ Tech United Eindhoven ]

To celebrate the Opportunity Mars rover having traveled an entire marathon on the Red Planet (26.2 miles over 11-ish years), JPL decided to host their own marathon, to be run in a slightly shorter amount of time. And you’ll never guess who wins:

I would like to point out that JPL is kind of built on a massive hill, and I feel bad for all these people. And jealous, because that looks like fun.

[ Oppy ]

If you haven’t seen a robot making a map with ROS before, this is pretty cool, especially since it’s on a hexapod:

[ Rhoeby ]

Looks like 3DRobotics might have a new quadcopter. It may have been designed by an infinite number of monkeys. At least, I think that’s what this video is saying:

[ 3DRobotics ]

With Pepper and a Myo wristband, the possibilities are limitless. Or, as limitless as this:

[ OTL ]

This video presents an overview of the hardware and the control applications of the TOrque-controlled humanoid RObot TORO developed at DLR. Applications include: compliant behavior for the whole robot, energy-based limit cycle controller for hand shaking, walking on flat ground, balancing on a rockerboard, and multi-contact whole-body control.

I love the sneakers at 2:15.

[ DLR ]

Crabster 200 has learned how to, uh, fap! I think they mean “flap.” I hope they do.

[ KIOST ]

The follow-up to that Prallplatte video from last week gets, um, creepy:

[ Prallplatte ]

Dr. Angelica Lim, who is awesome for lots of reasons, not least of which is the fact that she writes stuff for us sometimes, gave an excellent talk at TEDxKL on Robots, Emotions & Empathy:

And that Nao unboxing video from 2011? 166k views.

[ TEDx ]

Lastly this week: UT Austin robotics professor Luis Sentis visited Georgia Tech a few weeks ago to give a talk on Humanoids of the Future. The video is a little, er, trippy, but if you love robotics enough, you won’t care:

[ Human Centered Robotics Lab ]
