You know what special and exciting things we have going on today? Not a one. We’re hosting a perfectly normal, reasonable, non-crazy Video Friday, because it seems like we haven’t had one of those in a while. There are, of course, things happening soon, like the 2015 Bay Area Robotics Symposium next week, followed by AUVSI Unmanned Systems Defense 2015 the week after that. But this week, you don’t have to worry about any of that. All you have to do is enjoy this delicious selection of soothingly straightforward robot videos.
This new soft robotic tentacle from Cornell is exciting because it’s been 3D printed in order to recreate the structure of muscles in a real octopus tentacle. The really wiggly part starts at about 1:25, so feel free to skip to that.
[ Cornell ]
Researchers (artists?) at Purdue put some soybean plants onto iRobot Create bases, and then had the robots seek out sunlight or LEDs to keep their plant passengers happy:
That’s all well and good for the soybeans, but I want a robot that will do this for me.
Did you know that seals can track prey underwater even when they’re wearing blindfolds and earmuffs, thanks to their sensitive whiskers? Also, did you know that they made earmuffs for seals? How amazing is that? Only slightly less amazing is that researchers at MIT are developing whiskers to help robots navigate underwater:
Also, seals are super cute.
[ MIT ]
This video from NimbRo at the University of Bonn shows “3D environment perception, localization, obstacle avoidance, and autonomous mission execution of a micro aerial vehicle in a complex 3D indoor environment.” The best part is by far the visualization of the lidar.
Well, I found this on YouTube, and it’s a big robot climbing a ladder, and that’s what I know about it.
[ YouTube ]
The future on Mars is humans and robots working together. This is according to NASA, which probably has a better idea of what the future of Mars is all about than anyone else:
[ NASA Ames ]
From Cambridge Consultants:
“Our demonstration of the technology has fruit stacked randomly in a bowl – with our robot using machine vision and some smart software to identify which piece of fruit is on top. It translates this information into real-world co-ordinates and positions the ‘hand’ to pick the required fruit, whilst avoiding other objects. The custom-made hand adapts to the shape of the fruit and securely grips it without damaging it. Once picked, the fruit can also be sorted by colour so that, for example, red apples can be separated from green apples.”
Hooray! Hooray for fruit!
How many drones can you control all at once? Lots of drones. Lots of them.
SUCK IT, DRONES!
This works by disrupting a drone’s comms, so what you’re seeing in the video is the drone adopting its “I’m lost and confused” protocol, which in this case is a safe descent. In other words, defeating this thing with your drone only requires making it autonomous, so better get on that, right?
Also, is this FCC approved?
In 7 seconds, watch an algorithm design a quadrotor frame:
This is an experiment in generative design from Autodesk, where you tell a computer what materials you have and what you want your quadrotor to be able to do, and it grows a structure for you.
“Unmanned technology with the potential to change the face of naval operations within a decade has successfully been demonstrated for the first time by BAE Systems in partnership with ASV at a site near Portsmouth Naval Base. The new system will allow crews to carry out vital tasks such as high speed reconnaissance and remote surveillance while keeping sailors out of harm’s way.”
Every time I find one of these military demo videos, the music is like twice as dramatic as the time before. It’s impressive, really, but it makes me wonder what the endpoint is: just one continuous explosion?
[ BAE Systems ]
The ARCAS Project “proposes the development and experimental validation of the first cooperative free-flying robot system for assembly and structure construction.” They’re not quite there yet, but here’s how far they’ve gotten over the last three years:
[ ARCAS Project ]
Toby Walsh, along with Stuart Russell and Max Tegmark, was one of the authors of the open letter calling for a ban on autonomous weaponized robots. We’ve had a bunch of, er, lively discussions about this, and Toby Walsh recently gave a talk at TEDxBerlin that’s worth watching.
[ TEDxBerlin ]
Tully Foote, hovercraft builder and ROS Platform Manager at the Open Source Robotics Foundation, talks to “Robots in Depth” about open source, ROS, and lots of other stuff.
[ Robots in Depth ]
And we’ll end with… “Multimodal Machine Learning: Modeling Human Communication Dynamics,” a talk by Louis-Philippe Morency from CMU.
“Human face-to-face communication is a little like a dance, in that participants continuously adjust their behaviors based on verbal and nonverbal cues from the social context. Today's computers and interactive devices are still lacking many of these human-like abilities to hold fluid and natural interactions. Leveraging recent advances in machine learning, audio-visual signal processing and computational linguistics, my research focuses on creating computational technologies able to analyze, recognize and predict human subtle communicative behaviors in social context. I formalize this new research endeavor with a Human Communication Dynamics framework, addressing four key computational challenges: behavioral dynamics, multimodal dynamics, interpersonal dynamics and societal dynamics. Central to this research effort is the introduction of new probabilistic models able to learn the temporal and fine-grained latent dependencies across behaviors, modalities and interlocutors. In this talk, I will present some of our recent achievements modeling multiple aspects of human communication dynamics, motivated by applications in healthcare (depression, PTSD, suicide, autism), education (learning analytics), business (negotiation, interpersonal skills) and social multimedia (opinion mining, social influence).”
[ CMU ]