It’s going to be a long, long time before robots are sophisticated enough that we should worry about them taking over from humans. Having said that, there are things that are simple enough for artificial intelligence systems to learn to solve faster and more effectively than humans can. Like video games. When it comes to video games, humans really are doomed, and you can watch it happen right now.
That, and other videos, because it’s Video Friday.
First Breakout, then the world! Read more at the excellent website below.
[ IEEE Spectrum ]
Forget Jibo. The social robot you want as your home companion and assistant is this Polish bug-eyed robotic head called EMYS, developed at the Wroclaw University of Technology.
[ FLASH ]
SoftBank has delayed the sale of its social robot Pepper until sometime this summer (this month, only developers will get the robots). But the company has released a video showing what Pepper will be able to do and how it expects people to respond to the robot. It’s cute, even if you don’t speak Japanese:
[ Pepper ]
Speaking of cute:
[ Techy Kids ]
Still speaking of cute:
[ Plen ]
I’m not exactly sure why, but watching the Arcturus UAV with its Jump VTOL system take off and land over and over again is somehow very, very soothing.
Maybe it’s the sound.
[ Arcturus UAV ]
In Japan, why would you ever use a lock around a pole when you can rely on a complex robotic system instead?
[ YouTube ]
Not enough Japan for you? Check out the wearable tomato-feeding robot.
Via [ Gizmodo ]
How do you study untrustworthy behaviors in humans? Using a robot, of course!
What are the nonverbal behaviors that constitute a signal related to the trustworthiness of a novel person? The psychology department at Northeastern University (NEU) identified a candidate set of nonverbal cues (face touching, arms crossed, leaning backward, and hand touching) that was hypothesized to be indicative of untrustworthy behavior. However, in order to confirm and further validate such findings, a common practice in social psychology is to employ a human actor to perform certain nonverbal cues and study their effects in a human-subjects experiment. A fundamental challenge inherent in this research design is that people regularly emit cues outside of their own awareness, which makes it difficult even for trained professional actors to express specific cues in a reliable fashion. Our strategy for meeting this challenge was to employ a social robotics platform. By utilizing a humanoid robot, we took advantage of its programmable behavior to control exactly which cues are emitted to each participant. In collaboration with the Social Emotions Lab at NEU, Johnson Graduate School of Management, and Cornell University, we found through a human-subjects experiment that the robot’s expression of the hypothesized nonverbal cues resulted in participants perceiving the robot as a less trustworthy agent.
Just like iCub, I generally prefer to interact with humans while balancing on one foot:
This video shows the latest results in whole-body control of iCub, the humanoid robot developed by the Italian Institute of Technology. In particular, it shows the performance of the balancing controller when the robot stands on one foot. Knowledge of the robot's dynamics and measurement of external perturbations allow it to interact safely with humans as well as to execute highly dynamic motions.
Control of the robot is achieved by regulating the interaction forces between the robot and its surrounding environment. In particular, the force and torque exchanged between the robot's foot and the floor are regulated so that the robot keeps its balance even when strongly perturbed.
These new capabilities will be pivotal when iCub cohabits with human beings in domestic environments. The results were achieved by researchers at the Italian Institute of Technology, in particular those funded by the European projects CoDyCo and Koroibot, with Dr. Francesco Nori as principal investigator.
[ CoDyCo ]
I love this idea: using hydrogen fuel cells, Boeing’s UAV can fly for 8 hours emissions-free on sunlight and water.
[ Boeing ]
Hey, it’s Google Glass doing something useful for once:
People with extreme disabilities such as ALS and quadriplegia often find it hard to move about on their own and interact with their environments due to their immobility. Our work -- nicknamed "Project Chiron" -- hopes to alleviate some of this immobility with a kit that can be used on any Permobil-brand wheelchair.
Let me level with you: if it weren’t for the music and the fact that this leg is called “SHIZZLE,” I probably wouldn’t be including this vid:
CyPhy Works? Given enough snow, CyPhy Plays:
NASA Armstrong (previously NASA Dryden) does a lot of very cool research featuring unmanned systems. We got a tour of the place in 2013, including the model shop where they put together small-scale research aircraft. Here’s an overview of what they’ve been up to recently:
[ NASA Armstrong ]
Via [ RoboHub ]
Closing out this week is a CMU Seminar, featuring Michael Wagner from NREC on “Developing Trust in Autonomous Robots”:
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.
Erico Guizzo is the digital product manager at IEEE Spectrum. He oversees the operation, integration, and new feature development for all digital properties and platforms, including the Spectrum website, newsletters, CMS, editorial workflow systems, and analytics and AI tools. He’s the cofounder of the IEEE Robots Guide, an award-winning interactive site about robotics. An IEEE Member, he is an electrical engineer by training and has a master’s degree in science writing from MIT.