Video Friday: Walking the XDog, Muscle-Powered BioBots, and Rollin’ Justin Will Clean Your Kitchen

Your weekly selection of awesome robot videos



You missed a spot, Justin.
Photo: DLR

Video Friday is your weekly selection of awesome robotics videos, collected by your mysophobic Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RobArch 2016 – March 14-19, 2016 – Sydney, Australia
European Robotics Forum – March 21-23, 2016 – Ljubljana, Slovenia
RoboCup European Open – March 30-April 4, 2016 – Eindhoven, Netherlands
WeRobot 2016 – April 1-2, 2016 – Miami, Fla., USA
National Robotics Week – April 2-10, 2016 – United States
AISB HRI Symposium – April 5-6, 2016 – Sheffield, United Kingdom
Robotics in Education 2016 – April 14-15, 2016 – Vienna, Austria
NASA Swarmathon – April 18-22, 2016 – NASA KSC, Fla., USA
LEO Robotics Congress – April 21, 2016 – Eindhoven, Netherlands
International Collaborative Robots Workshop – May 3-4, 2016 – Boston, Mass., USA
ICARSC 2016 – May 4-6, 2016 – Bragança, Portugal
Robotica 2016 – May 4-8, 2016 – Bragança, Portugal
ARMS 2016 – May 9-13, 2016 – Singapore
ICRA 2016 – May 16-21, 2016 – Stockholm, Sweden
NASA Robotic Mining Competition – May 18-20, 2016 – NASA KSC, Fla., USA
Skolkovo Robotics Conference – May 20, 2016 – Skolkovo, Russia
Innorobo 2016 – May 24-26, 2016 – Paris, France
RoboCity16 – May 26-27, 2016 – Madrid, Spain
RoboBusiness Europe – June 1-3, 2016 – Odense, Denmark
IEEE RAS MRSSS 2016 – June 6-10, 2016 – Singapore
CR-HRI – June 6-10, 2016 – Orlando, Fla., USA


Let us know if you have suggestions for next week, and enjoy today’s videos.

Aww, can we take it for a walk?

XDog is a small electric quadruped designed and built by Xing Wang, a graduate student at Shanghai University, with support from his adviser Jia Wenchuan. The robot has 12 motors (3 DoF per leg) and relies on force sensors on each foot, an IMU, and joint-angle sensors for control. Wang says XDog “has just been born”: he’s still working on its walking gait (top speed is currently about 0.6 m/s) and hopes to add capabilities like running and jumping. And if you noticed all the Boston Dynamics references in the video, yes, Wang is a fan:

“Marc Raibert from Boston Dynamics is my idol,” he says. “Their papers helped me to design XDog.” He adds: “But Boston Dynamics quadruped robots are large and expensive, and I want to use [a different design] to make quadruped robots simpler and smaller, so that they can help ordinary people with things like carrying objects or as companions.”
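
Wang hasn’t shared any of XDog’s software, but if you’re curious what a sensor-driven gait loop for a 12-motor quadruped might look like, here’s a toy Python sketch. Everything in it (the driver functions, the gains, the trot reference) is invented for illustration; none of it comes from XDog itself:

```python
# Toy sketch of a sensor-driven gait loop for a 12-motor quadruped.
# All interfaces here are hypothetical stand-ins, not XDog's real drivers.
import math
import time

NUM_LEGS, JOINTS_PER_LEG = 4, 3          # 12 motors total, 3 DoF per leg

def read_sensors():
    """Stand-in for the IMU, foot-force, and joint-angle readings."""
    return {
        "body_roll": 0.0,                # rad, from the IMU
        "body_pitch": 0.0,               # rad, from the IMU
        "foot_forces": [0.0] * NUM_LEGS, # N, one sensor per foot
        "joint_angles": [[0.0] * JOINTS_PER_LEG for _ in range(NUM_LEGS)],
    }

def send_joint_targets(targets):
    """Stand-in for the motor command interface (12 position targets)."""
    pass

def trot_reference(t, phase):
    """Open-loop trot for one leg: sinusoidal hip swing plus knee lift."""
    hip = 0.25 * math.sin(2 * math.pi * 1.5 * t + phase)
    knee = max(0.0, 0.4 * math.sin(2 * math.pi * 1.5 * t + phase))
    return [hip, knee, -knee / 2]        # [hip, knee, ankle] in rad

PHASES = [0.0, math.pi, math.pi, 0.0]    # diagonal leg pairs move together

def control_step(t):
    s = read_sensors()
    targets = []
    for leg in range(NUM_LEGS):
        ref = trot_reference(t, PHASES[leg])
        # Crude attitude feedback: lean the hips against body pitch/roll.
        ref[0] -= 0.5 * s["body_pitch"]
        ref[1] += 0.5 * s["body_roll"] * (1 if leg % 2 else -1)
        targets.append(ref)
    send_joint_targets(targets)

if __name__ == "__main__":
    t0 = time.time()
    for _ in range(10):                  # a few ticks of a ~100 Hz loop
        control_step(time.time() - t0)
        time.sleep(0.01)
```

A real controller would also use the foot-force readings for stance detection; this sketch only closes the loop on attitude, which is exactly the sort of thing Wang says he’s still tuning.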

[ XDog ]

Hilton is experimenting with a new robotic concierge by stuffing IBM’s Watson into the tiny plastic skull of a NAO:

Named after Conrad Hilton, Connie uses a combination of Watson APIs – Dialog, Speech to Text, Text to Speech and Natural Language Classifier – along with WayBlazer’s extensive travel domain knowledge, to interact with guests. Interested in knowing where the hotel pool is? Connie will make sure you are heading to the right floor. Want to know where a local sushi restaurant is? Connie can pull up recommendations within walking distance. Connie learns and builds its knowledge base through each guest interaction.
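
IBM hasn’t published Connie’s code, but the pipeline described above is easy to picture. Here’s a hypothetical Python sketch with stand-in functions where the four Watson services (and WayBlazer’s travel data) would plug in; none of these are real Watson SDK calls:

```python
# Hypothetical sketch of the pipeline IBM describes for Connie: speech in,
# intent classification, dialog, speech out. The watson_* functions are
# invented stand-ins, not real Watson SDK calls.

def watson_speech_to_text(audio: bytes) -> str:
    return "where is the pool"           # canned transcript for the sketch

def watson_classify(text: str) -> str:
    # A Natural Language Classifier maps free text to a trained intent label.
    return "hotel_directions" if "pool" in text else "local_recommendations"

def watson_dialog(intent: str, text: str) -> str:
    # Dialog picks a response; WayBlazer's travel data would back the
    # local-recommendations branch in the real system.
    responses = {
        "hotel_directions": "The pool is on the third floor.",
        "local_recommendations": "There is a sushi place two blocks away.",
    }
    return responses.get(intent, "Let me find someone who can help.")

def watson_text_to_speech(text: str) -> bytes:
    return text.encode()                 # stand-in for synthesized audio

def handle_guest(audio: bytes) -> bytes:
    text = watson_speech_to_text(audio)
    intent = watson_classify(text)
    reply = watson_dialog(intent, text)
    return watson_text_to_speech(reply)

if __name__ == "__main__":
    print(handle_guest(b"...").decode())
```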

Too bad the video is 98 percent fluff. It would be cool to see Connie actually, you know, doing something.

[ IBM ]

From the HIT Lab at the University of Canterbury in New Zealand:

Robots could be useful for preserving important human cultural material. We feel a deep love and respect for the Maori traditions, and we want to contribute to the preservation of this cultural heritage. Using our expertise in social robotics, we programmed NAO robots to perform the Maori Haka. We hope that people can come to know and appreciate the value of New Zealand’s culture through the robots too.

NAO needs to be way, way scarier to pull that off properly.

[ HIT Lab ]

Remember that crazy idea of building tiny walking “biobots” powered by muscle cells from mouse hearts and legs? Apparently it’s working:

The tiny BioBots engineered at one NSF-funded Science and Technology Center (STC) move a bit like inchworms, but they represent giant strides in science and engineering. They can be controlled with electrical or optical signals and use muscle tissue for power. The mission of the STC on Emergent Behaviors of Integrated Cellular Systems (EBICS) is to develop the science and technology needed to engineer clusters of living cells. This will eventually help mankind address challenges in health, security and the environment.

EBICS researchers at the forefront of this novel and multidisciplinary field are committed to sharing responsible and ethically conscious practices for forward engineering biological machines. Currently, researchers are focused on BioBots that mimic the body, but, perhaps one day, biological machines could replace animals for drug testing, or be used to detect and neutralize toxins in the environment or even sequester carbon dioxide (CO2) from the atmosphere.

[ NSF ]

An amputee was able to feel smoothness and roughness in real time with an artificial fingertip that was surgically connected to nerves in his upper arm. Moreover, the nerves of non-amputees can also be stimulated to feel roughness without the need for surgery, meaning that prosthetic touch for amputees can now be developed and safely tested on intact individuals.

“The stimulation felt almost like what I would feel with my hand,” says amputee Dennis Aabo Sørensen about the artificial fingertip connected to his stump. He continues, “I still feel my missing hand, it is always clenched in a fist. I felt the texture sensations at the tip of the index finger of my phantom hand.”

Nerves in Sørensen’s arm were wired to an artificial fingertip equipped with sensors. A machine controlled the movement of the fingertip over different pieces of plastic engraved with different patterns, smooth or rough. As the fingertip moved across the textured plastic, the sensors generated an electrical signal. This signal was translated into a series of electrical spikes, imitating the language of the nervous system, then delivered to the nerves.
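
EPFL’s exact encoding isn’t spelled out here, but the general idea of turning a sliding sensor trace into a spike train can be shown with a toy Python snippet (the signal and threshold are invented):

```python
# Toy illustration of encoding a fingertip sensor trace as spikes. The real
# EPFL encoding is not specified here; the signal and threshold are invented.
import math

def sensor_trace(n=200, ridge_period=20):
    """Fake sensor output while sliding over a texture with regular ridges."""
    return [math.sin(2 * math.pi * i / ridge_period) for i in range(n)]

def spike_train(signal, threshold=0.9):
    """Emit a spike index each time the signal rises through the threshold."""
    return [i for i in range(1, len(signal))
            if signal[i - 1] < threshold <= signal[i]]

rough = spike_train(sensor_trace(ridge_period=20))   # fine ridges
smooth = spike_train(sensor_trace(ridge_period=80))  # coarser, smoother feel
# Finer ridges pass under the fingertip more often, so the spike rate is
# higher; a rate difference like this is something nerves can convey.
print(len(rough), len(smooth))   # 10 vs. 3 spikes over the same pass
```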

Sørensen could distinguish between rough and smooth surfaces 96 percent of the time.

[ EPFL ]

A very cool demo from ASL at ETH Zurich, showing a quadrotor doing dynamic autonomous navigation around obstacles at 4 Hz:

Next, a real forest, please.

[ ASL ]

Echo Voyager, Boeing’s latest unmanned undersea vehicle (UUV), can operate autonomously for months at a time thanks to a hybrid rechargeable power system and modular payload bay. The 51-foot-long vehicle is the latest innovation in Boeing’s UUV family.

If I had to choose one word to describe Echo Voyager, it would probably be “submarine,” not “awesome.”

[ Boeing ]

This squishy industrial robot (it’s covered in foam) didn’t crush me when I visited Fanuc last week, for which I am most grateful. It’s very sensitive to forces, and is ISO certified to work with people, even with its 35-kg payload.

[ Fanuc ]

We covered TU Delft’s awesome space haptics stuff a while back, and here’s an excellent overview of their research.

[ TU Delft ]

This is McGill Robotics’ critical design presentation for the 2016 University Rover Challenge. Introducing Bhūmi, our latest Mars rover!

McGill’s rover will be competing in the Mars Society’s University Rover Challenge this June.

[ McGill Robotics ]

Momaro managed to be (probably) one of the least expensive robots to compete in the DRC Finals, but its innovative wheeled-leg design carried it to a fourth-place finish. Here’s how its creators, the NimbRo Rescue team at the University of Bonn, did it:

[ NimbRo Rescue ]

Rollin’ Justin was a bit of a slacker. It used to spend a lot of time learning cool (but not very useful) tricks. We’re happy to report that the DLR humanoid is now ready to become a productive member of society, beginning with learning how to clean our cars and kitchens.

Universal robotic agents are envisaged to perform a wide range of manipulation tasks in everyday environments. A common action observed in many household chores is wiping, such as the absorption of spilled water with a sponge, skimming breadcrumbs off the dining table, or collecting shards of a broken mug using a broom. To cope with this versatility, the agents have to represent the tasks on a high level of abstraction.

In this work, we propose to represent the medium in wiping tasks (e.g., water, breadcrumbs, or shards) as a generic particle distribution. This representation lets us describe wiping tasks as a desired state change of the particles, which allows the agent to reason about the effects of wiping motions in a qualitative manner. Based on this, we develop three prototypical wiping actions for the generic tasks of absorbing, collecting, and skimming. The Cartesian wiping motions are resolved to joint motions exploiting the free degree of freedom of the involved tool. Furthermore, the workspace of the robotic manipulators is used to reason about the reachability of wiping motions.
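
To make the particle idea concrete, here’s a toy Python sketch of a “collecting” wipe evaluated by the state change it induces on a particle set. The geometry and numbers are invented for illustration and are not DLR’s implementation:

```python
# Toy sketch of the particle representation: the medium is a set of 2-D
# particles, and a "collecting" wipe is judged by the state change it
# induces on them. Geometry and numbers invented; not DLR's implementation.
import math
import random

random.seed(0)
particles = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(50)]
GOAL = (0.5, 0.5)                        # collection target, e.g. a dustpan

def apply_wipe(pts, start, end, width=0.15):
    """Particles near the stroke get swept along to its end point.
    (Coarse model: 'near' means within `width` of the stroke midpoint.)"""
    mx, my = (start[0] + end[0]) / 2, (start[1] + end[1]) / 2
    return [end if math.hypot(px - mx, py - my) < width else (px, py)
            for (px, py) in pts]

def mean_distance(pts, goal):
    """Qualitative task progress: average particle distance to the goal."""
    return sum(math.hypot(x - goal[0], y - goal[1]) for x, y in pts) / len(pts)

before = mean_distance(particles, GOAL)
particles = apply_wipe(particles, start=(0.2, 0.2), end=GOAL)
after = mean_distance(particles, GOAL)
# A good collecting stroke moves the distribution toward the goal, which is
# the kind of qualitative effect reasoning the abstract describes.
print(f"mean distance to goal: {before:.3f} -> {after:.3f}")
```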

We evaluate our methods in simulated scenarios, as well as in a real experiment with the robotic agent Rollin’ Justin.

“Robotic Agents Representing, Reasoning, and Executing Wiping Tasks for Daily Household Chores,” by Daniel Leidner, Wissam Bejjani, Alin Albu-Schäffer, and Michael Beetz, was published in Proc. of the International Conference on Autonomous Agents and Multiagent Systems (AAMAS), Singapore, May 2016.

[ DLR ]

We think it is possible to build safe cargo drone routes connected by cheap droneports across much of the planet. Quiet, beautiful, goose-like craft will lift off from the droneports carrying precious cargo along fixed routes in the lower sky, saving lives and creating jobs at massive scale. Here is the first concept for a droneport we would like to build in Rwanda. It will cost about the same to build as a petrol station.

This is what we should be using delivery drones for: important stuff in remote places, where they’re the only real option. Now enough with the CGI; let’s make this happen, people.

[ Vimeo ] via [ PopSci ]

I know nothing about this besides the fact that it comes from Seoul National University and is freakily cool:

[ SNU ]

Undergraduates working in Kod*lab are important to the lab’s success in building robots and advancing research. The students’ experiences have often led to undergraduate-authored refereed publications and exciting careers in robotics. Kod*lab’s recent undergraduate research assistants, Shafag Idris ’15, Electrical and Systems Engineering, and Justin Starr ’15, Mechanical Engineering and Applied Mechanics, share their experiences and insights from working in Kod*lab, along with their post-graduation plans.

[ Kod*lab ]

Well, I guess there’s one reason to go to New Jersey... Andy Shen brought one of his Nerf disc-shooting drones to Liberty Science Center:

Motherboard talked to Andy about these things last year. Right now, the frame and hardware to turn your drone into a deadly(ish) weapon cost just $150.

[ Shendrones ] via [ Shapeways ]

This particular video of Yaskawa robots putting stuff on pallets isn’t particularly interesting, except that if you have a VR headset (like Google Cardboard), you can check it out in 3D.

[ Motoman ]

Here are some videos showing interesting research from Vikash Kumar over at the University of Washington. The first video, of UW’s 24-DoF ADROIT hand, is especially interesting:

This video describes results from a method for learning dexterous manipulation skills with a pneumatically actuated, tendon-driven 24-DoF hand. The method combines iteratively refitted time-varying linear models with trajectory optimization, and can be seen as an instance of model-based reinforcement learning or as adaptive optimal control. Its appeal lies in the ability to handle challenging problems with surprisingly little data. We show that we can achieve sample-efficient learning of tasks that involve intermittent contact dynamics and under-actuation. Furthermore, we can control the hand directly at the level of the pneumatic valves, without the use of a prior model that describes the relationship between valve commands and joint torques. We compare results from learning in simulation and on the physical system. Even though the learned policies are local, they are able to control the system in the face of substantial variability in initial state.
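
The core trick, iteratively refitting a time-varying linear model from rollouts, is simple to illustrate. Below is a toy Python sketch of just the model-fitting half; the dimensions and dynamics are invented (nothing like ADROIT’s 24 DoF), and the LQR-style backward pass that would turn the fitted models into an improved policy is omitted:

```python
# Toy sketch of "iteratively refitted time-varying linear models": fit
# x_{t+1} ~ A_t x_t + B_t u_t from rollouts by least squares, one model
# per time step. Dimensions and dynamics are invented, not ADROIT's.
import numpy as np

rng = np.random.default_rng(0)
T, NX, NU, N_ROLLOUTS = 20, 4, 2, 12

# "True" dynamics, unknown to the learner and seen only through rollouts.
A_true = np.eye(NX) + 0.05 * rng.standard_normal((NX, NX))
B_true = 0.1 * rng.standard_normal((NX, NU))

def rollout():
    """One noisy trajectory under a random exploration policy."""
    xs, us = [rng.standard_normal(NX)], []
    for _ in range(T):
        u = 0.5 * rng.standard_normal(NU)
        xs.append(A_true @ xs[-1] + B_true @ u + 0.01 * rng.standard_normal(NX))
        us.append(u)
    return xs, us

data = [rollout() for _ in range(N_ROLLOUTS)]

# Fit a separate linear model (A_t, B_t) at every time step.
models = []
for t in range(T):
    Z = np.stack([np.concatenate([xs[t], us[t]]) for xs, us in data])  # [x; u]
    Y = np.stack([xs[t + 1] for xs, us in data])                       # x_next
    W, *_ = np.linalg.lstsq(Z, Y, rcond=None)    # solves Z @ W ~ Y
    models.append((W.T[:, :NX], W.T[:, NX:]))    # (A_t, B_t)

# In the full method, an LQR-style backward pass over `models` would yield
# an improved local policy, and the loop repeats with fresh rollouts.
A0, B0 = models[0]
print("A fit error:", np.linalg.norm(A0 - A_true))
print("B fit error:", np.linalg.norm(B0 - B_true))
```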

Data-driven methods have led to advances in multiple fields, including robotics. These methods, however, have had limited impact on dexterous hand manipulation, partly due to the lack of rich, physically consistent datasets and the technology to collect them. To fill this gap, we developed a virtual reality system combining real-time motion capture, physics simulation, and stereoscopic visualization. The system enables a user wearing a CyberGlove to “reach in” to the simulation and manipulate virtual objects through contacts with a tele-operated virtual hand. The system is evaluated on a subset of tasks in the Southampton Hand Assessment Procedure, which is a clinically validated test of hand function. The system is also being used by performer teams in the DARPA Hand Proprioception & Touch Interfaces program to develop neural control interfaces in simulation.

[ UW ]

Thanks Vikash!

And finally, from the CMU Robotics Institute seminar series: M. Ani Hsieh, Associate Professor, Drexel University

Exploiting the Environment to Improve Autonomy: Robots in Geophysical Flows

Unlike many aerial and ground robots, underwater robots operate in a communication- and localization-limited environment where their dynamics are tightly coupled with the environmental dynamics. While this tight coupling makes control challenging, it provides a unique opportunity for robots to exploit environmental forces to improve and prolong their autonomy. In this talk, I’ll show the limitations of existing air- and ground-based strategies and present our efforts in improving vehicle autonomy by better understanding the dynamics of the geophysical fluid environment. The talk will describe our efforts in using robot teams to track coherent structures, which are of great importance since they give us a way to map and represent the dynamics of the fluid environment. I will then show how this information can be exploited to develop more efficient control and coordination strategies for networks of AUVs/ASVs operating in these environments.

[ CMU RI Seminar ]
