I’m not sure if there’s any sort of metric for how successful Aldebaran Robotics’ Pepper robot is at working in customer service. People are certainly buying them, but by itself, that doesn’t say much about whether people actually like the robot or find it effective. In any case, Hitachi wants a piece of whatever Pepper’s going after, and to do so, they’ve upgraded their EMIEW robot for a customer assistance role.
Video Friday is your weekly selection of awesome robotics videos, collected by your fluid-filled Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
We haven’t heard anything from SCHAFT over the past three years, and all we know is that they’re now part of X, Alphabet’s experimental technology lab. Somehow, Andy Rubin convinced them to show up to his keynote at the New Economy Summit (NEST) in Tokyo, and they brought a brand new bipedal robot along with some absolutely incredible video of what they’ve been up to.
Today is christening day for DARPA’s Sea Hunter, a full-scale prototype of an autonomous surface vessel that’s designed to be able to launch from a pier and go out on its own for weeks or months at a time, for thousands of miles at a stretch.
The 132-foot-long, diesel-powered vessel was built by U.S. defense contractor Leidos under DARPA’s ACTUV program, a somewhat clunky nested acronym that stands for Anti-Submarine Warfare (ASW) Continuous Trail Unmanned Vessel.
Are you building a robot that’s supposed to autonomously navigate in a useful way? Cool, that means you’ll be needing a LIDAR system, then. For better or worse, it’s usually just that straightforward: LIDAR is arguably the best sensor we have right now for reliable navigation, localization, and obstacle avoidance for ground robots. In terms of relatively low-cost sensors, sonar has poor resolution and short range; structured light and time-of-flight sensors are short range and don’t work well outdoors; and camera-based vision systems aren’t robust enough for reliable navigation.
The “relatively low cost” bit is the problem: LIDARs are pricey, and even an “affordable” 2D unit, with a range of 10 meters or less, can cost you over US $1,000. This is an enormous problem for both hobbyists and cost-conscious commercial robotics developers (i.e., every single commercial robotics developer).
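Part of why LIDAR units cost so much is the timing electronics they need. As a rough illustration (this sketch is mine, not from the article), a time-of-flight LIDAR just measures how long a light pulse takes to bounce back, so the distance math itself is trivial:

```python
# Minimal time-of-flight distance calculation: a LIDAR emits a light
# pulse and times how long the reflection takes to return.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the target in meters, given the round-trip pulse time."""
    # The pulse travels out and back, so halve the round-trip path.
    return C * round_trip_s / 2.0

# A target 10 meters away returns the pulse in roughly 67 nanoseconds,
# which is why the expensive part is the picosecond-scale timing
# hardware, not the math.
print(tof_distance(66.7e-9))  # roughly 10 m
```

The hard engineering is resolving those nanosecond-scale intervals accurately while spinning the emitter to cover a full field of view.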
Humanoid robots can sense the world around them, move their bodies, and interact with people in ways that are similar to the ways that real people interact. But a robot’s “human-ness” is (at least for now) all just a simulation. It’s a combination of clever software and, in some cases, hardware designed to make it easy for us to fool ourselves into thinking that some glorified box of circuits is even a little bit like a person. We’re very, very good at fooling ourselves like this, to the point where it starts to get a little weird.
Researchers from Stanford University will present a paper at the Annual Conference of the International Communication Association in Fukuoka, Japan, in June, titled “Touching a Mechanical Body: Tactile Contact With Intimate Parts of a Human-Shaped Robot is Physiologically Arousing.” The study shows that when a NAO robot asks humans to touch its butt, we get uncomfortable. This is weird because NAO doesn’t really have a butt in the traditional sense of the word, and even if it did, it’s just a robot, and on a very basic level it doesn’t care where you touch it. So what is going on here?
Today, Kinema Systems, a robotics startup based in Menlo Park, Calif., is coming out of stealth mode to announce Kinema Pick, which is “the world’s first self-training, self-calibrating software solution for robotic depalletizing.” I know, it sounds a little dry, but they have a convincingly cool demo, and we have lots of details on how the system works (and why it’s important) from Kinema co-founder and CEO Sachin Chitta.
Video Friday is your weekly selection of awesome robotics videos, collected by your gullible Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
To do capable and useful things, your robot needs capable and useful sensors, which is just another way of saying that your robot needs you to spend a lot of money on it. This is really too bad, because hardware cost is enormously restrictive for robots, especially ones that are intended to be affordable for people who haven’t co-founded a robotics startup or something (I think there are a few people left who have yet to do this). In particular, distance sensors that allow your robot to detect and avoid obstacles tend to be both very useful and very expensive, and if you want one that works reliably outdoors, start saving, because they cost thousands of dollars.
At MIT, a group of researchers led by Professor Li-Shiuan Peh designed a phone-based laser rangefinder that costs a total of $49, plus a smartphone that you’re not using anymore. Is it the greatest laser rangefinder ever? Not even close. But for less than $50, it’s pretty darn great anyway.
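The article doesn’t walk through the math, but cheap rangefinders built from a camera plus an offset laser typically work by triangulation: the laser dot’s apparent position in the image shifts with target distance. Here’s a minimal sketch under that assumption; the baseline and focal-length values are hypothetical, not taken from the MIT design:

```python
# Hypothetical camera-plus-laser triangulation rangefinder. The laser
# is mounted a fixed baseline away from the camera lens and aimed
# parallel to the optical axis, so the closer the target, the farther
# the laser dot appears from the image center.

BASELINE_M = 0.06   # laser-to-camera offset in meters (assumed)
FOCAL_PX = 700.0    # camera focal length in pixels (assumed)

def distance_from_pixel_offset(dot_offset_px: float) -> float:
    """Target distance in meters from the laser dot's pixel offset.

    By similar triangles: offset_px / focal_px = baseline / distance.
    """
    return BASELINE_M * FOCAL_PX / dot_offset_px

# A dot 21 pixels from the image center: 0.06 * 700 / 21 = 2.0 m
print(distance_from_pixel_offset(21.0))  # 2.0
```

Note the inverse relationship: pixel offset shrinks as distance grows, which is why this style of rangefinder loses accuracy quickly at long range and a $49 build can’t compete with a real LIDAR.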
Bipedal walking has worked out pretty well for humans. I guess. We’re kind of stuck with it until someone comes up with something better. And the really frustrating part is that all kinds of animals have already come up with better ways of getting around: specifically, birds and insects, who use wings to fly as well as legs and feet to walk. This multimodality makes birds and insects inherently versatile and adaptable, which is why you can find them doing quite well just about everywhere.
Some of the most versatile and adaptable robots also exhibit multimodal characteristics: they can fly and climb, or jump and glide, or even fly and swim. But flying and walking seems to be by far the most useful combination, as evidenced by the variety of animals that can do it, and researchers at the University of Pennsylvania’s GRASP Laboratory have designed a new robot called Picobug that can fly, walk, and even (soon) grab on to stuff.