Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):
AUVSI Xponential – May 8-11, 2017 – Dallas, Texas, USA
AAMAS 2017 – May 8-12, 2017 – Sao Paulo, Brazil
Austech – May 9-12, 2017 – Melbourne, Australia
Innorobo – May 16-18, 2017 – Paris, France
Midwest Robotics Workshop – May 18-19, 2017 – Chicago, Ill., USA
NASA Robotic Mining Competition – May 22-26, 2017 – NASA KSC, Fla., USA
IEEE ICRA – May 29 – June 3, 2017 – Singapore
University Rover Challenge – June 1-3, 2017 – Hanksville, Utah, USA
IEEE World Haptics – June 6-9, 2017 – Munich, Germany
NASA SRC Virtual Competition – June 12-16, 2017 – Online
ICCV 2017 – June 13-16, 2017 – Venice, Italy
RoboBoat 2017 – June 20, 2017 – Daytona Beach, Fla., USA
Aerial Robotics International Research Symposium – June 21-22, 2017 – Toronto, Canada
Hamlyn Symposium on Medical Robotics – June 25-28, 2017 – London, England
Autonomous Systems World – June 26-27, 2017 – Berlin, Germany
RoboUniverse Seoul – June 28-30, 2017 – Seoul, Korea
RobotCraft 2017 – July 3, 2017 – Coimbra, Portugal
Let us know if you have suggestions for next week, and enjoy today’s videos.
To celebrate National Robotics Week, the Florida Institute for Human and Machine Cognition (IHMC) recently hosted its annual Robotics Open House at the newly constructed Levin Center for IHMC Research. IHMC researchers demonstrated work across multiple platforms, including the Atlas humanoid robot, the Mina v2 exoskeleton, the Planar Elliptical Runner, and virtual reality stations.
That little scampering robot on the treadmill? It’s called Planar Elliptical Runner. It’s new, and here’s more about it:
The Planar Elliptical Runner is a running robot that can run about 12 miles per hour on a treadmill. It is sandwiched between two plates of glass, which keep it in the sagittal plane. It cannot turn or tip sideways, but it can tip forward or backward and fall down. The robot is "open loop stable." There is only one motor driving the legs, and there are no sensors or computers on board. An RC car radio controller determines how much power to apply to the motor: give the robot more power and it runs faster; less power and it runs slower. In simulation we are exploring how to extend this concept to 3D running, and initial simulations are showing promising results.
Even from just looking at the simulation of the next generation, you can see how the foot placement changes from parallel to overlapping. We’re excited to both see this thing in action, and hear about how they made it work.
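The control scheme IHMC describes really is that simple: the RC throttle sets motor power open loop, with no sensing or feedback, and the mechanism's passive stability does the rest. A minimal sketch of that throttle-to-power mapping (the linear scale and the clamping range are illustrative assumptions, not IHMC's actual values):

```python
def motor_power(throttle, max_power=1.0):
    """Open-loop mapping from a normalized RC throttle [0, 1] to motor power.

    No sensors, no computers, no feedback -- more power simply means a
    faster gait, as in IHMC's description. The linear scaling and the
    max_power constant are made up for illustration.
    """
    clamped = max(0.0, min(1.0, throttle))
    return clamped * max_power

# Half throttle -> half power; the robot's mechanics turn power into speed.
print(motor_power(0.5))   # 0.5
print(motor_power(1.2))   # clamped to 1.0
```

The point of the sketch is what's absent: there is no state estimate and no error term anywhere in the loop, which is what "open loop stable" means in practice.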
[ IHMC ]
Sonia works in a lab right next to the Ghost Robotics folks, and for Star Wars Day, she dressed Minitaur as a cute little AT-AT, thanks to a dog costume stuffed with bubble wrap:
If AT-ATs could actually move like this, the Rebel Alliance wouldn’t have had a chance.
[ Ghost Robotics ]
I won’t spoil who won the 2017 Portuguese Open midsize robot soccer tournament, but here’s a hint: These videos of the semifinals and finals both come from TU Eindhoven.
That passing technique just gets better and better every year.
I like how she says “Mommy’s trying to vacuum.” No, mommy is busy giving confusing instructions to the dog while Roomba does the vacuuming.
This grasping research from ASU’s Interactive Robotics Lab shows what it takes to train a robot to do something very simple that humans can do without even thinking:
[ ASU ]
More than you ever wanted to know about Festo’s BionicCobot.
Hey, it runs ROS!
[ Festo ]
Most drone manuals advise you not to fly when it’s raining M&Ms. Here’s what it looks like if you do it anyway:
Confetti is also mesmerizing, although you don’t really need to watch more than the first 20 seconds of either of these videos.
[ DRL ]
It’s not like any beach I’ve ever been to, but this 360-degree mosaic of Ogunquit Beach on Mars shows where one of the loneliest robots in the solar system is right now:
[ NASA JPL ]
Here are the final matches from the FIRST Robotics world championship in St. Louis (skip to 3:24):
If you’re super confused about what’s going on, don’t worry, that’s normal for the uninitiated. Also, note that the robots are only autonomous for the first 15 seconds of the competition: The rest of the time, they’re being remote controlled. Hopefully FIRST will increase the autonomous time in the future, since the biggest challenges in robotics are really in software nowadays. Anyway, here’s the second final:
Unmanned: This documentary film examines autonomy and its effects on humans and society. It takes a behind-the-scenes look at the research and implementation of autonomous systems. In particular, the documentary focuses on humanoid robots, robotic milking systems, and self-driving cars. It addresses the following questions: How much automation is too much automation? What kind of work should humans be doing versus what kind of work should robots be doing? What is the biggest impact that robots will have on humans? Unmanned reaches the ultimate conclusion that robots are a reflection of ourselves and therefore help us better understand our own nature.
“Unmanned” was produced by Alyson Campbell and Seth Boudreau from Saint Michael’s College in Vermont.
[ SMC ]
NASLab at Michigan Tech celebrated National Robotics Week by giving a tour of the lab and its robotic projects.
[ NAS Lab ]
Here are a pair of talks from Michigan Robotics Day. The first one is from Talia Moore, postdoctoral fellow in ecology and evolutionary biology at the University of Michigan, on “A Symbiotic Relationship Between Bio-Inspired Robotics and Evolutionary Biology.”
And the second is the University of Michigan’s Jessy Grizzle, highlighting some of the robotics research he’s been working on.
Two CMU RI seminars for you this week. The first is Selma Sabanovic on “Robots for the Social Good”:
Robots are expected to become ubiquitous in the near future, working alongside and with people in everyday environments to provide various societal benefits. In contrast to this broad-ranging social vision for robotics applications, evaluations of robots and studies of human-robot interaction have largely focused on more constrained contexts, mostly dyadic and small-group interactions in laboratories. As a result, we have a limited understanding of how robots are perceived, adopted, and supported in open-ended, natural social circumstances in which researchers have little control of the ensuing interactions.
This talk will discuss insights from a series of studies of the design and use of socially assistive robots (SARs) for eldercare aimed at expanding our awareness of the broader cultural, organizational, and societal dynamics that affect the use and consequences of robots outside the laboratory. Our in-home interviews with older adults suggested that existing robot designs reproduce unwanted stereotypes of aging, while naturalistic observation of robot use in a nursing home shows that ongoing labor by various groups of users is needed to produce successful voluntary human-robot interactions. In response to these findings, we are currently engaging in participatory design of robots with older adults and clinicians to provide an opportunity for mutual learning, inspire both sides to think beyond common stereotypes of older adults and robots, and identify non-technical issues of particular concern to clinicians and older adults that may affect long-term robot adoption. These concerns include the fit of robots to the home environments and values of older adults, to the labor practices and clinical needs of care staff, and to the broader healthcare infrastructure (e.g. insurance mechanisms). In conclusion, I will discuss ways to address broader organizational and societal issues in the course of robot design and development, working together with potential users and other stakeholders to avoid unwanted consequences and create robust social supports that can cope with the inevitable challenges that emerge when we apply robots in society.
And the second is Sidd Srinivasa on “Robotic Manipulation Under Clutter and Uncertainty With and Around People”:
Robots manipulate with superhuman speed and dexterity on factory floors, yet they fail under even moderate amounts of clutter or uncertainty. However, human teleoperators perform remarkable acts of manipulation with the same hardware. My research goal is to bridge the gap between what robotic manipulators can do now and what they are capable of doing. What human operators intuitively possess that robots lack are models of interaction between the manipulator and the world that go beyond pick-and-place. I will describe our work on nonprehensile physics-based manipulation that has produced simple but effective models, integrated with proprioception and perception, that have enabled robots to fearlessly push, pull, and slide objects, and reconfigure clutter that gets in the way of their primary task. But human environments are also filled with humans. Collaborative manipulation is a dance, demanding the sharing of intentions, inferences, and forces between the robot and the human. I will also describe our work on the mathematics of human-robot interaction that has produced a framework for collaboration using Bayesian inference to model the human collaborator, and trajectory optimization to generate fluent collaborative plans. Finally, I will talk about our new initiative on assistive care that focuses on marrying physics, human-robot collaboration, control theory, and rehabilitation engineering to build and deploy caregiving systems.
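The Bayesian human model the abstract mentions can be illustrated with a toy goal-inference step: given a prior over which object the human is reaching for and the likelihood of the observed motion under each candidate goal, one Bayes update sharpens the robot's belief. (The goal names and the likelihood numbers below are invented for illustration; this is a generic Bayesian update, not Srinivasa's actual formulation.)

```python
def bayes_goal_update(prior, likelihood):
    """One Bayesian update over candidate human goals.

    prior: dict mapping goal -> P(goal)
    likelihood: dict mapping goal -> P(observation | goal)
    Returns the normalized posterior P(goal | observation).
    """
    unnorm = {g: prior[g] * likelihood[g] for g in prior}
    z = sum(unnorm.values())
    return {g: p / z for g, p in unnorm.items()}

# Two hypothetical reach targets with equal prior belief.
prior = {"mug": 0.5, "bowl": 0.5}
# The observed hand trajectory is twice as likely if the goal is the mug.
posterior = bayes_goal_update(prior, {"mug": 0.8, "bowl": 0.4})
print(posterior["mug"])  # ~0.667: belief shifts toward the mug
```

Run over a stream of observations, updates like this let the robot commit to a fluent assisting motion before the human finishes theirs, which is the essence of the collaboration framework described in the talk.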
[ CMU RI ]