Video Friday: One-Legged Hopper, Mini Humanoid, and Robot Heads

Your weekly selection of awesome robot videos

Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

FLOBI and iCub robot heads
Image: CITEC Bielefeld via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

Cybathlon Symposium – October 7, 2016 – Zurich, Switzerland
Cybathlon 2016 – October 8, 2016 – Zurich, Switzerland
Robotica 2016 Brazil – October 8-12, 2016 – Recife, Brazil
ROSCon 2016 – October 8-9, 2016 – Seoul, Korea
IROS 2016 – October 9-14, 2016 – Daejeon, Korea
NASA SRC Qualifier – October 10, 2016 – Online
ICSR 2016 – November 1-3, 2016 – Kansas City, Kan., USA
Social Robots in Therapy and Education – November 2-4, 2016 – Barcelona, Spain
Distributed Autonomous Robotic Systems 2016 – November 7-9, 2016 – London, England
AI-HRI – November 17-19, 2016 – Arlington, Va., USA


Let us know if you have suggestions for next week, and enjoy today’s videos.

They should totally sell these at Disney Stores. 

Powered by: The Raibert Hopping Controller™.

“Current and previous single-legged hopping robots are energetically tethered and lack portability. Here, we present the design and control of an untethered, energetically autonomous single-legged hopping robot. The thrust-producing mechanism of the robot’s leg is an actuated prismatic joint, called a linear elastic actuator in parallel (LEAP). The LEAP mechanism comprises a voice coil actuator in parallel with two compression springs, which gives our robot passive compliance. An actuated gimbal hip joint is realized by two standard servomotors. To control the robot, we adapt Raibert’s hopping controller, and find we can maintain balance roughly in-place for up to approx. 7 seconds (19 hops) while continuously hopping.”
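Raibert’s controller, for the curious, comes down to a surprisingly simple foot-placement rule: put the foot where it would “catch” the body given the current velocity, plus a correction proportional to the velocity error. Here’s a minimal, hypothetical sketch in Python; the gains, stance time, and function name are made up for illustration, not taken from Disney’s (unpublished) implementation:

```python
# Minimal sketch of the classic Raibert foot-placement rule.
# All parameter values below are assumptions for illustration only.

def raibert_foot_placement(v, v_des, stance_time, k_v=0.05):
    """Return the desired foot x-offset (m) from the hip at the next touchdown.

    v           -- current horizontal velocity (m/s)
    v_des       -- desired horizontal velocity (m/s); 0 to hop in place
    stance_time -- duration of the previous stance phase (s)
    k_v         -- velocity feedback gain (assumed value)
    """
    neutral_point = v * stance_time / 2.0   # where the foot would "catch" the body
    correction = k_v * (v - v_des)          # place the foot farther out to slow down
    return neutral_point + correction

# Example: hopping in place while drifting forward at 0.2 m/s
print(raibert_foot_placement(v=0.2, v_des=0.0, stance_time=0.17))
```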

[ Disney Research ]

If you have a robot that runs ROS and makes maps, you’ve likely been using GMapping, which gives you very GMapping-y results. Google has just released its own open source ROS-based real-time 2D/3D SLAM library called Cartographer, and it already has support for PR2, TurtleBot, Revo LDS (aka Neato’s lidar), and even TRI’s HSR:

Excellent taste in magazines, TRI! And check out this real-time loop closure:

POW! For more details, read this 2016 ICRA paper.

[ Google Research ] via [ TRI ]

Thanks Tully!

“Toyota Motor Corporation plans to launch sales of its compact and cuddlesome ‘Kirobo Mini’ communication partner through Toyota vehicle dealers across Japan in 2017. The nationwide rollout will be preceded by presales at designated dealers in Tokyo and Aichi Prefecture this winter, for which advance orders will be taken online. Kirobo Mini is tentatively priced at 39,800 yen (excluding tax).”

You had me at “cuddlesome.” You’re looking at about $380 (plus a trip to Japan) if you want one of these little dudes.

[ Toyota ]

The Cybathlon is happening in Switzerland this weekend, and one of the exoskeletons in use will be TWIICE, from EPFL:

“One of TWIICE’s main advantages is that it’s light. Because the device weighs only 14 kilos – it consists mainly of composite materials – it easily becomes one with the user. The hip and knee joints are flexed and extended by two electric motors per leg, and the exoskeleton’s charge lasts for three hours. The device can bear the entire weight of the user, who nevertheless needs crutches to maintain balance and a steady gait. There are buttons in the handles to actuate steps and set the pace: fast walk, slow walk, climb steps, stop, etc.

‘Our goal is to make the vertical world accessible to handicapped people,’ said Mohamed Bouri, a group leader at the LSRO and the project supervisor. ‘In several years, it will undoubtedly be common to see people in exoskeletons standing up and walking around outside or in stores.’ ”

[ EPFL ]

This is YEMO, a “semi-autonomous micro rover” that’s perfectly happy driving around underwater as well as on land:

[ YEMO 1.1 ]

“This video demonstrates the expressive capabilities of our proposed control framework that facilitates human inspired actuation for anthropomorphic robot heads.”

Flobi and iCub, eh? I feel like that’s a relationship that has a lot of potential for strangeness.

[ CITEC Bielefeld ]

I. Love. This. Robot.

“Deltu is a delta robot with a personality that interacts with humans via two iPads. Depending on its mood, it plays with the user, who is faced with an artificial intelligence simulation that appreciates the small pleasures of life, sometimes too much. The relationship we have with robots/AI that have been created to enhance our performance, but have become a source of learning, is unique and exciting. The android’s place in society has not yet been defined and remains to be determined; for me it is the best source of inspiration.”

[ Alexia Léchot ]

MIT has been applying materials with programmable bouncy-ness to those tongue-based jumping robots that we covered last year:

[ CSAIL ]

NASA’s SPHERES robots on the ISS are now resilient to being blinded in one eye. Clearly, this is in preparation for some sort of invasion by aliens brandishing pointy sticks:

“During an experiment performed on board the International Space Station (ISS), a small drone successfully learned by itself to see distances using only one eye. While humans can close one eye and still be able to tell whether a particular object is far away, in robotics many would consider this extremely hard. ‘It is a mathematical impossibility to extract distances to objects from one single image, as long as one has not experienced the objects before,’ says Guido de Croon from Delft University of Technology, one of the principal investigators of the experiment. ‘But once we recognise something to be a car, we know its physical characteristics and we may use that information to estimate its distance from us. A similar logic is what we wanted the drones to learn during the experiments.’ Only here, the learning had to happen in an environment with no gravity, where no particular direction is favoured, so the drone had to overcome that difficulty as well.”
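De Croon’s point about known objects is essentially the pinhole-camera relationship: if you know something’s true size and how big it looks in the image, a single picture gives you range. A toy Python sketch of that idea (the numbers and function name are invented for illustration, not taken from the SPHERES experiment):

```python
# Pinhole camera model: apparent size scales inversely with distance,
# so distance = focal length (px) * true size (m) / size in image (px).
# All values here are illustrative assumptions.

def distance_from_known_size(focal_length_px, real_size_m, size_in_image_px):
    """Estimate range to an object of known physical size from one image."""
    return focal_length_px * real_size_m / size_in_image_px

# Example: an object 0.2 m across that appears 50 pixels wide through a
# camera with a 600-pixel focal length is roughly 2.4 m away.
print(distance_from_known_size(600, 0.2, 50))
```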

[ TU Delft ] via [ ESA ]

The Residence Inn at LAX has been having good experiences with their Relay robot from Savioke:

[ Savioke ]

Here’s a pair of videos showing mobile manufacturing robots from Fraunhofer IFF:

The VALERI project, supported by the European Commission under the FP7 “Factories of the Future” Public-Private Partnership, had its final demonstration on October 22nd, 2015, at Airbus DS facilities in Sevilla, Spain. VALERI stands for “Validation of Advanced, Collaborative Robotics for Industrial Applications,” and the project aimed to show where mobile manipulators could be used to carry out non-ergonomic, monotonous tasks, or tasks that recur throughout the production process. Mobile manipulators need to be able to work side by side with humans, without separating barriers between them, so that both humans and robots can complete their work independently while still operating in close proximity or even on the same parts. The project focused on three exemplary tasks: applying sealant along a groove, inspecting the applied sealant, and inspecting braided carbon parts. While the first two applications are closely related, the VALERI consortium chose a third application to demonstrate the overall flexibility of the system. The robot is able to change tools, and the programming for carrying out completely new processes can be done quickly and intuitively.

“ ‘Annie’ is a research platform for mobile manipulation, developed by the Fraunhofer IFF in Magdeburg, Germany. The video demonstrates some basic skills and technologies for simple assembly tasks. The robot uses the multi-sensor light-field camera integrated in its head to detect and localize objects. The camera system was developed in the project ISABEL (www.projekt-isabel.de). Furthermore, a new framework for the integration of intelligent IoT devices is used, which allows the control of the screwdriver via WLAN.”

[ Fraunhofer IFF ]

PR2 can change tires now?

“Learning Object Orientation Constraints and Guiding Constraints for Narrow Passages from One Demonstration,” by Changshuo Li and Dmitry Berenson, was just presented at the International Symposium on Experimental Robotics (ISER) in Tokyo.

[ ISER ]

I do not know what this is but it’s cute and was at CEATEC in Japan:

[ Kazumichi Moriyama ]

Here’s Christoph Bartneck with an interesting perspective on HRI, exploring material challenges: the importance of what your robot is actually made of when it comes to human interaction.

“Physical interaction is the defining attribute for human-robot interaction. The haptic qualities of a robot are essential for its success. A robot must feel right. Currently robot developers design from the inside out. They first develop the robot’s interior before adding a shell around it.

While this approach might be suitable for industrial application, it does not fit the requirements for human-robot interaction. We need to design robots from the outside in. First we need to design its appearance and haptic attributes. The material challenges are to develop materials that feel right for a robot. This often means hiding a hard core in a soft shell. This talk tries to define parameters for the materials used in human-robot interaction.”

[ Christoph Bartneck ]

This week’s CMU RI seminar comes from Umamaheswar Duvvuri on Surgical Robotics: past, present, and future.

“The advent of robotic systems in medicine has revolutionized the practice of surgery. Most recently, several novel robotic surgical systems have been developed and are entering the operative theater. This lecture will describe the current state of the art in robotic surgery. We will also describe some of the newer systems that are currently in use. Finally, the future of robotic surgery will be described in the context of clinical development and ease of use in the operating theaters of the future.”

[ CMU RI ]
