Video Friday: Japan's Avatar Robot, Lidar vs. Camera, and Knicks' Drone Show

Your weekly selection of awesome robot videos

Japan's teleoperated avatar-robot system
Photo: Keio University

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):

Humanoids 2017 – November 15-17, 2017 – Birmingham, U.K.
iREX 2017 – November 29-December 2, 2017 – Tokyo, Japan

Let us know if you have suggestions for next week, and enjoy today’s videos.

Takahiro Nozaki and colleagues at the Faculty of Science and Technology and the Haptics Research Center at Keio University developed a haptics-based avatar robot with a General Purpose Arm (GPA) that transmits sound, vision, movement, and, importantly, a highly sensitive sense of touch (force-tactile transmission) to a remotely located user in real time. “This ‘real-haptics’ is an integral part of the Internet of Actions (IoA) technology, having applications in manufacturing, agriculture, medicine, and nursing care,” says Nozaki.

This is the world’s first high-precision tactile-force transmission technology that can record human movements, edit them, and reproduce them. The arm also does away with conventional touch sensors, making it cheaper, more compact, and more robust against malfunction and noise. At the core of the avatar robot are high-precision motors integrated into the arm and the algorithms that drive them: precise control of force and position is what allows a sense of touch to be transmitted without touch sensors.
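The idea of reflecting touch without a touch sensor can be illustrated with a toy 1-DoF simulation: the remote arm tracks the operator's position, and the environment force is inferred from the commanded force and the resulting acceleration rather than measured directly. This is only a hedged sketch of the general principle — the names, gains, and dynamics below are invented, and Keio's actual controller is more sophisticated:

```python
def simulate_bilateral(master_traj, kp=400.0, kd=40.0, k_env=50.0,
                       wall=0.05, m_s=1.0, dt=1e-3):
    """Toy 1-DoF bilateral teleoperation: the slave arm tracks the
    master's position, and the environment force (a virtual wall
    spring past `wall` meters) is *estimated* from commanded force
    and acceleration instead of being measured by a touch sensor."""
    x_s, v_s = 0.0, 0.0          # slave position and velocity
    slave_pos, reflected = [], []
    for x_m in master_traj:      # one master position per time step
        f_env = -k_env * (x_s - wall) if x_s > wall else 0.0
        f_cmd = kp * (x_m - x_s) - kd * v_s   # PD position controller
        a = (f_cmd + f_env) / m_s
        v_s += a * dt
        x_s += v_s * dt
        # Sensorless estimate: commanded force minus inertial force
        # equals the environment reaction -- reflect it to the master.
        f_hat = f_cmd - m_s * a
        slave_pos.append(x_s)
        reflected.append(f_hat)
    return slave_pos, reflected

# Holding the master 0.1 m into a virtual wall at 0.05 m produces a
# steady reflected force the operator would feel.
pos, force = simulate_bilateral([0.1] * 5000)
```

In free space the estimate is zero; only contact produces a reflected force, which is exactly the behavior a sensorless haptic link needs.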

[ Keio University ]

Here’s an interesting side-by-side video from Velodyne comparing one of their VLP-32s to a camera in the real world. Not that Santana Row in San Jose is representative of the real world (if you’ve been there, you know what I’m talking about), but still.

Our VLP-32C mounted on top of a Ford Fusion picks up the details in the bustle at Santana Row outdoor shopping mall. Our LiDAR sensor delivers hundreds of thousands of data points per second containing distance measurements of nearby vehicles, pedestrians, and traffic signs with accuracy.
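For context on those "hundreds of thousands of data points per second": each return is a range measurement along a known firing direction (a fixed per-channel elevation plus the current azimuth), which the driver converts into Cartesian points. A minimal sketch of that conversion, using my own axis convention and function name rather than Velodyne's actual driver API:

```python
import math

def polar_to_xyz(distance_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range plus firing direction) to a
    Cartesian point. Assumed axis convention: x forward, y left,
    z up, azimuth measured counterclockwise from +x."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horiz = distance_m * math.cos(el)    # projection onto x-y plane
    return (horiz * math.cos(az),
            horiz * math.sin(az),
            distance_m * math.sin(el))

# A 10 m return straight ahead from a level channel:
# polar_to_xyz(10.0, 0.0, 0.0) -> (10.0, 0.0, 0.0)
```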

[ Velodyne ]

Verity Studios is responsible for the drone show at the beginning of this sportsball event:

Slick move, catching those drones at the end there.

[ Verity Studios ]

Kevin Knoedler, winner of the NASA Space Robotics Challenge (SRC), visited the Northeastern/UMass Lowell team to test his approach from simulation on the real Valkyrie robot. The video shows Kevin controlling the robot with the interface and autonomy he developed to turn the wheels in the first SRC task: aligning a communications dish. In the simulated environment, these wheels control the elevation and azimuth of the dish. In addition, the team’s virtual reality interface is tested on the same task setup.

[ UMass Lowell ]

Thanks Velin!

Happy Holidays? It’s November 10th, DJI. November 10th!

Also, it’s disappointing to see CGI used for something that you could plausibly do with real drones.

[ DJI Spark ]

Before we can reliably estimate the position and orientation of real blackberry flowers, this initial experiment was done with "QR flowers". Next research step: grow genetically modified plants that show QR codes on flowers?

[ WVU ]

Anki has a new video series about their little robot cars jumping stuff. It’s kinda hokey in a force-feeding you hilarity way, but this particular episode features a hometown favorite of mine, Voodoo Donuts, so it gets a pass.

[ Anki ]

Sawyer’s latest software update provides stats on exactly how much better it is than you:

Intera® 5.2 is an expansion of our Intera software platform that provides critical data insights to manufacturers in real time. Collaborative robot Sawyer™ gives operators and line managers valuable data at a glance, including metrics such as cycle time, part count, speed and force – data that has never before been available through a single collaborative robot vendor.
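Metrics like cycle time and part count can in principle be derived from nothing more than per-part completion timestamps. A hypothetical sketch of that computation — this is not Intera's actual API or data model:

```python
from statistics import mean

def cycle_metrics(completion_times_s):
    """Part count and mean cycle time from the timestamps (seconds)
    at which the robot finished each part. A toy stand-in for this
    kind of dashboard metric."""
    cycles = [b - a for a, b in zip(completion_times_s,
                                    completion_times_s[1:])]
    return {
        "part_count": len(completion_times_s),
        "mean_cycle_time_s": mean(cycles) if cycles else None,
    }

# Four parts finished at t = 0, 10, 21, and 30 s:
# cycle_metrics([0.0, 10.0, 21.0, 30.0])
# -> {'part_count': 4, 'mean_cycle_time_s': 10.0}
```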

[ Rethink Robotics ]

As self-driving cars push the transportation revolution, we reflect back on the competition that started it all. Today, on the 10-year anniversary of the DARPA Urban Challenge, Torc has grown its original team, gained a decade of experience, and continues to push its tech farther to make the impossible possible.

Torc CEO Michael Fleming and Program Manager Jesse Hurdus recall Victor Tango’s 3rd place win in the DARPA Urban Challenge, an event that brought together a winning team of engineers that have celebrated 10 years of growth in the autonomous vehicle industry. They tell the story of their autonomous car, Odin, navigating a challenging 60-mile course without a driver, and elaborate on Torc’s growth in the past ten years.

Torc is doing some impressive stuff; here, one of their autonomous cars avoids a deer crossing the road at night:

[ Torc ]

These robo-balloons from X, Alphabet’s advanced tech division, launch from a site in Nevada and fly all the way to Puerto Rico, where they are delivering basic internet connectivity to more than 100,000 people. Puerto Rico’s telecom infrastructure, heavily damaged by Hurricane Maria in September, remains in bad shape. X teamed up with AT&T, T-Mobile, and local government authorities to help out with balloon-beamed internet. In this video, Project Loon launch specialist Pedro Emmanuelli, a Puerto Rico native, describes what it’s like to help bring connectivity back to the island.

[ Project Loon ]

Meanwhile, AT&T has deployed a FLYING COW to Puerto Rico to restore connectivity:

I too was disappointed that it appears to be a tethered helicopter called COW (Cell on Wings) and not an actual flying cow. Sigh.

[ AT&T ]

From Erle Robotics, the first completely modular arm robot based on H-ROS, the Hardware Robot Operating System:

[ Erle Robotics ] via [ H-ROS ]

Seismic surveying requires placing a large number of sensors (geophones) in a grid pattern, triggering a seismic event, and recording vibration readings. The goal of the surveying is often to locate subsurface resources. Traditional seismic surveying employs human laborers for sensor placement and retrieval. The major drawbacks of surveying with human deployment are the high costs and time, and risks to humans due to explosives, terrain, and climatic conditions. We propose an autonomous, heterogeneous sensor deployment system using unmanned aerial vehicles to deploy mobile and immobile sensors. The proposed system begins to overcome some of the problems associated with traditional systems.
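As a rough illustration of the deployment problem (not the paper's actual planner), the sketch below generates a survey grid and greedily assigns each UAV the nearest unplaced sensor target; all names and the assignment rule are invented here:

```python
import itertools
import math

def geophone_grid(rows, cols, spacing_m):
    """(x, y) targets for a rows x cols sensor grid, row-major order."""
    return [(c * spacing_m, r * spacing_m)
            for r in range(rows) for c in range(cols)]

def assign_nearest(uav_starts, targets):
    """Greedy baseline: UAVs take turns claiming the nearest unplaced
    sensor target, each continuing from its last drop point."""
    remaining = list(targets)
    plan = {i: [] for i in range(len(uav_starts))}
    pos = list(uav_starts)
    for i in itertools.cycle(range(len(uav_starts))):
        if not remaining:
            break
        target = min(remaining, key=lambda p: math.dist(pos[i], p))
        remaining.remove(target)
        plan[i].append(target)
        pos[i] = target
    return plan
```

A real system would also handle battery limits, terrain, and the mobile/immobile sensor split the abstract mentions; greedy nearest-target is just the simplest possible baseline.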

[ Paper ] via [ Aaron Becker ]

Small and medium enterprises in Europe often refrain from using advanced robot technology. The EU project Factory-in-a-Day aimed at changing this by developing a robotic system that can be set up and operational within 24 hours, and that is flexible, leasable, and cheap. The video shows the results and achievements of the project after four years of work.

[ Factory in a Day ]

TIAGo has a wide range of mobility thanks to its 12 degrees of freedom (without end effector). Its 7-DoF arm and lifting torso enable TIAGo to reach objects from the floor up to a height of 1.70 m.

A learning-by-demonstration software application lets TIAGo pre-record movements and repeat them as many times as requested.
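At its simplest, record-and-replay learning by demonstration reduces to storing joint-space frames during a demonstration and streaming them back on request. A minimal hypothetical sketch — this is not PAL's software, just the general pattern:

```python
class TrajectoryRecorder:
    """Record joint-space frames during a kinesthetic demonstration,
    then stream them back on demand (illustrative sketch only)."""

    def __init__(self):
        self.frames = []

    def record(self, joint_positions):
        """Append one snapshot of the arm's joint positions."""
        self.frames.append(tuple(joint_positions))

    def replay(self, times=1):
        """Yield recorded frames, repeating the demo `times` times;
        a real controller would interpolate and rate-limit these."""
        for _ in range(times):
            yield from self.frames
```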

[ PAL Robotics ]

It’s now been 10 years since the DARPA Urban Challenge, but it doesn’t seem likely that CMU will forget their win anytime soon.

[ CMU ]

Earlier this week, MegaBots ran a live fight event after getting some backlash about their non-live fight event last month. Round 1 lasted under a minute because one of the giant robots had its power plant explode; rounds 2 and 3 were slightly better.

I mean, I guess I get the appeal of a giant robot with a human inside? But honestly, Combots is way more dynamic and fun to watch, IMO.

[ MegaBots ]

MOTOBOT, the motorcycling robot developed by SRI International and Yamaha, is now up to Version 2. This longish video goes through a lot of the technical details, including some innovative actuators.

[ SRI ]

This week’s CMU RI Seminar comes from Dmitry Berenson, University of Michigan, on “What Matters for Deformable Object Manipulation”:

Deformable objects such as cables and clothes are ubiquitous in factories, hospitals, and homes. While a great deal of work has investigated the manipulation of rigid objects in these settings, manipulation of deformable objects remains under-explored. The problem is indeed challenging, as these objects are not straightforward to model and have infinite-dimensional configuration spaces, making it difficult to apply established approaches for motion planning and control. One of the key challenges in manipulating deformable objects is selecting a model which is efficient to use in a control loop, especially when an accurate model is not available. Our approach to control uses a set of simple models of the object, determining which model to use at the current time step via a novel Multi-Armed Bandit algorithm that reasons over estimates of model utility. I will also present our work on interleaving planning and control for deformable object manipulation in cluttered environments, again without an accurate model of the object. Our method predicts when a controller will be trapped (e.g., by obstacles) and invokes a planner to bring the object near its goal. The key to making the planning tractable is to avoid simulating the motion of the object, instead only forward-propagating the constraint on overstretching. This approach takes advantage of the object’s compliance, which allows it to conform to the environment as long as stretching constraints are satisfied. Our method is able to quickly plan paths in environments with complex obstacle arrangements and then switch to the controller to achieve a desired object configuration.
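Berenson's talk describes a novel multi-armed bandit over model-utility estimates. As a stand-in illustration of the general idea — his actual algorithm differs — here is the classic UCB1 rule choosing among three hypothetical object models with made-up, fixed utilities:

```python
import math

def ucb1_select(counts, rewards, t):
    """UCB1: pick the model with the best mean utility plus an
    exploration bonus that shrinks as a model is used more."""
    for i, n in enumerate(counts):
        if n == 0:
            return i                     # try every model once first
    return max(range(len(counts)),
               key=lambda i: rewards[i] / counts[i]
                             + math.sqrt(2.0 * math.log(t) / counts[i]))

# Three hypothetical deformable-object models with fixed (in reality
# unknown and state-dependent) per-step utilities.
utilities = [0.1, 0.9, 0.3]
counts, rewards = [0, 0, 0], [0.0, 0.0, 0.0]
for t in range(1, 301):
    i = ucb1_select(counts, rewards, t)
    counts[i] += 1
    rewards[i] += utilities[i]           # deterministic feedback here
# The bandit concentrates on model 1 while still sampling the others.
```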

[ CMU RI ]
