Robots Made Out of Branches Use Deep Learning to Walk

Researchers used deep reinforcement learning to teach these strange robots how to move

Photo: Azumi Maekawa
Researchers from Preferred Networks built robots out of unusual materials like tree branches and used deep reinforcement learning to develop locomotion algorithms for them.

Designing robots is a finicky process, requiring an exhaustive amount of thought and care. It’s usually necessary to have a very clear idea of what you want your robot to do and how you want it to do it, and then you build a prototype, discover everything that’s wrong with it, build something different and better, and repeat until you run out of time and/or money.

But robots don’t necessarily have to be this complicated, as long as your expectations for what they should be able to do are correspondingly low. In a paper presented at a NeurIPS workshop last December, a group of researchers from Preferred Networks experimented with building mobile robots out of a couple of generic servos plus stuff you can find on the ground, like tree branches. 

These robots first figure out how to walk in simulation, through deep reinforcement learning. As implemented in the paper, the researchers pick up some sticks, weigh and 3D scan them, simulate the entire robot, and reward gaits that carry the robot farthest. There’s also some hand-tuning involved to avoid behaviors that might (for example) “cause stress and wear in the real robot.”
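To make the idea concrete, here is a minimal sketch of the kind of reward shaping described above: reward forward progress in simulation while penalizing joint effort, so the learned gait avoids motions that would stress the real servos. The function name, penalty form, and weights are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical reward shaping for learning a gait in simulation.
# Reward = forward displacement minus a penalty on joint effort,
# discouraging gaits that would wear out the real servos.

def gait_reward(x_before, x_after, joint_torques, torque_weight=0.01):
    """Score one simulation step: forward progress along x,
    minus a quadratic penalty on the torques applied at each joint."""
    forward_progress = x_after - x_before
    effort_penalty = torque_weight * sum(t * t for t in joint_torques)
    return forward_progress - effort_penalty

# Two steps with the same forward motion: the gentle one scores
# higher, so the learned policy prefers low-stress gaits.
gentle = gait_reward(0.0, 0.10, [0.5, 0.3])
harsh = gait_reward(0.0, 0.10, [5.0, 4.0])
print(gentle, harsh)
```

A reinforcement learning algorithm would then optimize the servo commands to maximize the sum of such rewards over an episode, which is how “farthest movement” becomes a training signal.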

Overall, this is probably not a strategy you could use for most applications, but we can speculate about how robots like these might become a little more practical. The idea of constructing a mobile robot out of whatever is lying around (plus some servos and maybe a sensor or two) is a compelling one, and it seems plausible that you could develop a gait from scratch on the physical robot using trial and error and feedback from some basic sensors, since we’ve seen similar things done on other robotic platforms.

Found-materials robots like these are unlikely to be as capable as traditional robotic designs, so they’d probably be useful only in special circumstances. But not having to worry about transporting structural materials would be nice, as would being able to create a variety of designs as needed from one generalized hardware set. And building a robot out of locally available materials means that anything you put together will be easy to fix, even if you do have to teach it to move all over again.

“Improvised Robotic Design With Found Objects,” by Azumi Maekawa, Ayaka Kume, Hironori Yoshida, Jun Hatori, Jason Naradowsky, and Shunta Saito, from Preferred Networks, Inc., was presented at the Workshop on Machine Learning for Creativity and Design at NeurIPS 2018.

[ Azumi Maekawa ] via [ HY-MA ]

Thanks Fan!

