
Video Friday: Roboy, AI Ethics, and Big Clapper

Your weekly selection of awesome robot videos


Erico Guizzo is IEEE Spectrum's Digital Innovation Director.

Big Clapper
Image: Big Clapper/Bye Bye World via YouTube

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

AIM 2018 – July 9-12, 2018 – Auckland, New Zealand
ICARM 2018 – July 18-20, 2018 – Singapore
ICMA 2018 – August 5-8, 2018 – Changchun, China
SSRR 2018 – August 6-8, 2018 – Philadelphia, Pa., USA
ISR 2018 – August 24-27, 2018 – Shenyang, China
BioRob 2018 – August 26-29, 2018 – University of Twente, Netherlands
RO-MAN 2018 – August 27-30, 2018 – Nanjing, China
ELROB 2018 – September 24-28, 2018 – Mons, Belgium
ARSO 2018 – September 27-29, 2018 – Genoa, Italy
IROS 2018 – October 1-5, 2018 – Madrid, Spain

Let us know if you have suggestions for next week, and enjoy today’s videos.

We wrote about HAMR back in February, and it’s now grown some cute lil’ footpads that allow it to walk on water, and also sink (on purpose).

In nature, cockroaches can survive underwater for up to 30 minutes. Now, a robotic cockroach can do even better. Harvard’s Ambulatory Microrobot, known as HAMR, can walk on land, swim on the surface of water, and walk underwater for as long as necessary, opening up new environments for this little bot to explore.

[ Nature ]

The Roboy team is beginning a series of videos where they explain interesting robotics papers in 2 minutes flat. Good luck!

In this video we discuss the paper "Task-oriented Grasping with Semantic and Geometric Scene Understanding" by Renaud Detry, Jeremie Papon, and Larry Matthies. How should robots grasp objects? Does knowing how to differentiate between a ball and a cup help a robot grasp the object better? We answer this here.

[ Roboy ]

MIT Media Lab is hosting a course on "The Ethics and Governance of AI," and six classes' worth of lectures are on YouTube right now.

This Spring 2018 term course is a cross-disciplinary investigation of the implications of emerging technologies, with an emphasis on the development and deployment of Artificial Intelligence. The course covers a variety of issues, including the complex interaction between governance organizations and sovereign states, the proliferation of algorithmic decision making, autonomous systems, machine learning and explanation, the search for balance between regulation and innovation, and the effects of AI on the dissemination of information, along with questions related to individual rights, discrimination, and architectures of control.

[ MIT Media Lab ]

Robot Start has a review of Big Clapper. TL;DR: You need this, and so does everyone else.

See?!

[ Big Clapper ] via [ Robot Start ]

This video focuses on David Hall’s invention of 3D LiDAR as the foundational technology of autonomous vehicles. It shows Velodyne LiDAR’s origins as a result of the DARPA Grand Challenge. This is also a great abbreviated overview of the Challenge, and the beginning of the autonomous revolution, with Velodyne LiDAR as the catalyst.

[ Velodyne ]

As a partner robot, it never leaves the side of its master. It transforms into a vehicle that augments its master’s physical functions—motional and sensory—and travels with the master as one. It is a machine lifeform produced from the latest robotics and AI technologies fused by product design. Its name is CanguRo.

[ furo ]

Skip to the second half of this video from ICRA to see some new footage of Honda’s E2-DR disaster robot, including some (now obligatory) shots of someone shoving it.

[ ICRA 2018 Spotlight ]

ROBOTIS has produced a series of 13 videos designed to help you learn ROS from scratch. It’s an entire curriculum, really, complete with a 500-page book and a bunch of resources on GitHub, all open source.

What you will learn from this course:

- From the basic concept to practical robot application programming!

- ROS Basic concept, instructions and tools

- How to use sensor and actuator packages on ROS

- Embedded board for ROS: OpenCR1.0

- SLAM & Navigation with TurtleBot3

- How to program a delivery robot using ROS Java

- OpenManipulator simulation using MoveIt! and Gazebo

[ YouTube ]
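To give a flavor of where the course starts, here is a minimal sketch of a ROS 1 node in Python that publishes velocity commands to a TurtleBot3-style robot. The /cmd_vel topic and geometry_msgs/Twist message are the conventions used throughout the TurtleBot3 tutorials; this is just an illustrative sketch, not material from the course itself.

```python
#!/usr/bin/env python
# Minimal ROS 1 node: drive a TurtleBot3-style robot in a slow circle.
# Assumes the standard /cmd_vel topic of type geometry_msgs/Twist.
import rospy
from geometry_msgs.msg import Twist

def main():
    rospy.init_node('circle_driver')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz

    cmd = Twist()
    cmd.linear.x = 0.1   # m/s forward
    cmd.angular.z = 0.5  # rad/s yaw

    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    main()
```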

This is some of the most lifelike walking I’ve ever seen from a humanoid robot at this scale.

[ DrGuero2001 ]

A reminder, from a robot, to be kinder to each other. RoboThespian performs an extract from Charlie Chaplin's powerful speech at the end of The Great Dictator. The voice is an audio recording of Chaplin himself, which is in the public domain (although the movie itself is not). RoboThespian was animated using "Virtual Robot," browser-based drag-and-drop software. Arguably, this demonstrates RoboThespian's acting versatility.

[ RoboThespian ]

Thanks Michael!

Photon, a little education robot from Poland, has been named best early-stage startup in Central and Eastern Europe by the European Business Angels Network Congress. Congrats!

Photon is the first robot in the world that develops together with your child. It makes a child's first steps in the world of new technology easily accessible, fun, and, most importantly, educational. Photon is integrated with child-friendly applications for smartphones and tablets that let kids interact with their robot by giving it commands. The apps were designed to be intuitive, easy to use, and to let children be part of some memorable adventures! They let kids program the robot in an intuitive graphical programming language designed specifically for children.

[ Photon ]

Researchers at the Cluster of Excellence CITEC (Bielefeld, Germany) study human grasping and manipulation skills and try to replicate them in multi-fingered robot hands. The key components for achieving this goal are tactile sensors integrated into the hands, which provide detailed force profiles during interaction. Employing appropriate tactile processing algorithms, the hands know when they lose an object or when an object starts to slip (so that they can increase grasping force). They can also explore the surface of unknown objects using tactile surveying methods. A tactile data glove records this force profile during human interaction with objects, allowing researchers to learn from humans. Together with high-accuracy vision-based hand-pose tracking, this provides researchers with valuable information that allows them to better understand and replicate human manual intelligence.

[ CITEC ]
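The slip-handling idea described above is simple to sketch: if the measured normal force on a held object drops quickly while the grasp should be static, squeeze a little harder. The snippet below is only a hypothetical illustration of that control idea, with made-up sensor and gripper interfaces; it is not CITEC's algorithm.

```python
# Hypothetical sketch of tactile slip handling: when the normal force
# drops faster than a threshold, increase the commanded grip force.
# read_normal_force() and set_grip_force() are placeholder interfaces.
import time

def regulate_grasp(read_normal_force, set_grip_force,
                   slip_rate_threshold=-2.0,  # N/s; tune per sensor
                   force_step=0.5,            # N added per reaction
                   max_force=20.0,
                   dt=0.01):
    """Monitor the tactile normal force and increase grip force on slip."""
    grip_force = 5.0                      # initial grasp force in newtons
    prev_reading = read_normal_force()
    while True:
        reading = read_normal_force()
        rate = (reading - prev_reading) / dt
        if rate < slip_rate_threshold:    # sudden drop: likely slip
            grip_force = min(grip_force + force_step, max_force)
            set_grip_force(grip_force)
        prev_reading = reading
        time.sleep(dt)
```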

An update from Google Brain and X on scalable deep reinforcement learning for robotic manipulation.

[ Google AI Blog ]

We present a novel computational approach to designing robotic devices from high-level motion specifications. Our computational system uses a library of modular components (actuators, mounting brackets, and connectors) to define the space of possible robot designs. The process of creating a new robot begins with a set of input trajectories that specify how its end effectors and/or body should move. By searching through the combinatorial set of possible arrangements of modular components, our method generates a functional, as-simple-as-possible robotic device that is capable of tracking the input motion trajectories. To significantly improve the efficiency of this discrete optimization process, we propose a novel heuristic that guides the search for appropriate designs. Briefly, our heuristic function estimates how much an intermediate robot design needs to change before it becomes able to execute the target motion trajectories. We demonstrate the effectiveness of our computational design method by automatically creating a variety of robotic manipulators and legged robots. To generate these results we define our own robotic kit that includes off-the-shelf actuators and 3D-printable connectors. We validate our results by fabricating two robotic devices designed with our method.

[ Disney Research ]
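The heuristic-guided discrete search the abstract describes is essentially best-first search over assemblies of modular parts, where the priority is an estimate of how far a partial design is from being able to track the target trajectories. Here is a generic sketch of that search pattern; the design representation, expansion rules, and heuristic are placeholders, not Disney's actual implementation.

```python
# Generic best-first search over modular robot designs, in the spirit of
# the abstract above. expand(), heuristic(), and can_track_trajectories()
# are hypothetical callables supplied by the caller.
import heapq
import itertools

def design_search(initial_design, expand, heuristic, can_track_trajectories,
                  max_expansions=100000):
    """Return the first design found that can track the target motion.

    expand(design)             -> designs with one more module attached
    heuristic(design)          -> estimated "distance" to a capable design
    can_track_trajectories(d)  -> True if d can execute the input motions
    """
    counter = itertools.count()  # tie-breaker so the heap never compares designs
    frontier = [(heuristic(initial_design), next(counter), initial_design)]
    seen = set()

    for _ in range(max_expansions):
        if not frontier:
            break
        _, _, design = heapq.heappop(frontier)
        if can_track_trajectories(design):
            return design
        for child in expand(design):
            key = repr(child)
            if key not in seen:
                seen.add(key)
                heapq.heappush(frontier,
                               (heuristic(child), next(counter), child))
    return None
```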

More IROS work from Kodlab, featuring Minitaur using its forelimbs to cutely push objects around.

We demonstrate the physical rearrangement of wheeled stools in a moderately cluttered indoor environment by a quadrupedal robot that autonomously achieves a user's desired configuration. The robot's behaviors are planned and executed by a three-layer hierarchical architecture consisting of: an offline symbolic task and motion planner; a reactive layer that tracks the reference output of the deliberative layer and avoids unanticipated obstacles sensed online; and a gait layer that realizes the abstract unicycle commands from the reactive module through appropriately coordinated joint-level torque feedback loops. This work also extends prior formal results about the reactive layer to a broad class of nonconvex obstacles. Our design is verified both by formal proofs as well as empirical demonstration of various assembly tasks.

[ Kodlab ]
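The three-layer structure in that abstract maps onto a simple control loop: the deliberative plan supplies waypoints, a reactive layer turns the current waypoint and sensed obstacles into a unicycle command (forward speed and turn rate), and a gait layer turns that command into leg-level control. The sketch below is only a schematic of that flow with invented function names and a crude avoidance rule; it is not the Kodlab stack.

```python
# Schematic of a three-layer architecture: deliberative plan ->
# reactive unicycle command (v, omega) -> gait-level execution.
# get_pose, sense_obstacles, and gait_layer are hypothetical placeholders.
import math

def reactive_layer(pose, waypoint, obstacles, gain_v=0.5, gain_w=1.5):
    """Compute a unicycle command toward the waypoint, turning away from
    the nearest obstacle when it gets close."""
    dx, dy = waypoint[0] - pose[0], waypoint[1] - pose[1]
    heading_error = math.atan2(dy, dx) - pose[2]
    v = gain_v * math.hypot(dx, dy)
    w = gain_w * math.atan2(math.sin(heading_error), math.cos(heading_error))
    if obstacles:
        ox, oy = min(obstacles,
                     key=lambda o: math.hypot(o[0] - pose[0], o[1] - pose[1]))
        dist = math.hypot(ox - pose[0], oy - pose[1])
        if dist < 0.5:
            bearing = math.atan2(oy - pose[1], ox - pose[0]) - pose[2]
            # Turn away from whichever side the close obstacle is on.
            w -= 2.0 * math.copysign(1.0, math.sin(bearing)) * (0.5 - dist)
    return v, w

def run(plan, get_pose, sense_obstacles, gait_layer, tolerance=0.1):
    """plan: list of (x, y) waypoints from the deliberative layer."""
    for waypoint in plan:
        while True:
            pose = get_pose()  # (x, y, theta)
            if math.hypot(waypoint[0] - pose[0],
                          waypoint[1] - pose[1]) < tolerance:
                break
            v, w = reactive_layer(pose, waypoint, sense_obstacles())
            gait_layer(v, w)   # realized by joint-level torque loops
```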

Researchers at the Cluster of Excellence CITEC are investigating human-machine interaction with Pepper, a humanoid robot. The robot’s capabilities include navigation, detection of objects and people, and grasping. Pepper can also learn new movements with the help of a human counterpart. Using Augmented Reality, the person interacting with Pepper can see how the robot perceives the world. In addition to this, Pepper also serves as a museum guide and competes in the household service league of the RoboCup robotics championship.

[ CITEC ]

A nice overview of Festo’s Bionic Learning Network robots.

[ Festo ]

It’s penalty shootout time between the Mekacademy and Delta Squads - with some national support for MekaMon’s home country, England!

From Reach Robotics comes MekaMon, the world’s first gaming robot. Blending the virtual and real worlds with augmented reality battles and head to head combat, MekaMon represents Next Level Gaming & AR.

It always kinda annoys me when robot videos fake things when it seems like they could totally have gotten the robot to do it for real, you know?

[ Mekamon ]

In the UIST Student Innovation Contest (aka the "SIC"), we explore how novel input, interaction, actuation, and output technologies can augment interactive experiences! This year, in partnership with Makeblock, we are seeking students who will push the boundaries of input and output techniques with our unusual human-robot interaction challenge! Join the UIST SIC and turn your ideas into reality! Win fabulous prizes!

We will give away the hardware to all selected teams! You can bring it home with you!

The UIST SIC is your opportunity to shine and impress the world with your creative ideas! Participants will demo their work during the demo reception at the conference in Berlin, Germany, and contest winners will be announced at the conference. A jury of UIST judges will select two winners for the Most Creative and Best Implementation categories. On top of that, conference attendees will get a chance to vote for their favorite teams in the People’s Choice category. All categories receive prizes!

[ UIST SIC ]

I’m really not sold on the whole autonomous flying taxi thing, but here’s a recent TED Talk on it anyway.

Flight is about to get a lot more personal, says aviation entrepreneur Rodin Lyasoff. In this visionary talk, he imagines a new golden age of air travel in which small, autonomous air taxis allow us to bypass traffic jams and fundamentally transform how we get around our cities and towns. "In the past century, flight connected our planet," Lyasoff says. "In the next, it will reconnect our local communities."

[ TED ]
