I don’t know why, but a lot of interesting technology companies seem to be clustered around Kitchener, in Southern Ontario, Canada. Clearpath Robotics is up there, and they’ve taken advantage of neighbors like Thalmic Labs to put together cool crossover demos. Turns out that audiovisual company Christie, which, among many other things, makes high-end digital projectors, is just 10 minutes away from Clearpath, so they all got together for a hack week and developed a sort of augmented reality game using projectors and real robots. See it running, and see lots of other robot videos, too. It’s Friday!
I WANT A ROBOT HOLODECK.
It’s a bit disappointing not to see more actual gameplay footage, but Ryan Gariepy explains some of it in a blog post:
We had two robots and everyone loves video games, so we wrote a new package that uses Python (via rospy), GDAL, and Shapely to create a real-life PvP game with our Jackals. Each Jackal was controlled by a person and had the usual features we all expect from video games—weapons, recharging shields, hitpoints, and sound effects. All of the data was rendered and projected in real time along with our robots’ understanding of their environment.
And, as a final bonus, we used our existing path planning code to create an entire “AI” for the robots. Since the robots already know where they are and how to plan paths, this part was done in literally minutes.
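Clearpath hasn’t released the game package itself, but the bookkeeping they describe (weapons, recharging shields, hitpoints) is easy to picture. Here’s a minimal, purely hypothetical sketch of that game loop in plain Python, with a simple circle test standing in for the Shapely geometry the real package used; all names and numbers here are made up:

```python
import math
from dataclasses import dataclass

@dataclass
class Robot:
    """Hypothetical per-Jackal game state: a position plus the usual
    video game stats (shields recharge, hitpoints don't)."""
    x: float
    y: float
    shield: float = 50.0
    max_shield: float = 50.0
    hitpoints: float = 100.0

    def take_hit(self, damage: float) -> None:
        # Shields absorb damage first; any overflow hits hitpoints.
        absorbed = min(self.shield, damage)
        self.shield -= absorbed
        self.hitpoints -= damage - absorbed

    def recharge(self, dt: float, rate: float = 5.0) -> None:
        # Shields slowly regenerate between hits.
        self.shield = min(self.max_shield, self.shield + rate * dt)

def hit(shot_x: float, shot_y: float, target: Robot,
        radius: float = 0.5) -> bool:
    """Stand-in for the real package's Shapely intersection test:
    did the shot land inside the target's circular footprint?"""
    return math.hypot(shot_x - target.x, shot_y - target.y) <= radius
```

In the real system this state would be updated from each robot’s localization estimate over rospy and rendered out through the projectors; the sketch above only shows the game-logic half.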
There’s lots more technical detail at the link below, for those of you who like knowing (say) how they managed to coordinate multiple robots in ROS.
[ Clearpath ]
Yes, last week included April Fool’s Day. Yes, there were some robot-related videos. Robohub has a bunch of them, but there are two particularly good ones that I’d like to highlight. The first is from Tesla:
Some of this is silly, for sure, like the avoidance of a human writing a ticket. But there’s no reason why a Tesla (once it’s self-driving) couldn’t be programmed to interact with parking meters and move itself to a new parking spot autonomously when the time runs out. Self-parking cars have been around for a while, and low-speed around-the-block driving is certainly not the hardest thing for the current generation of autonomous cars.
The second video is from IHMC’s DRC Team:
Just look at that speed and stability! Those bio-inspired joints! Five-fingered grippers! The only problem seems to be that it’s tethered for power, which IHMC is going to have to solve before the DRC Finals. My suggestion: try feeding it biomass.
[ IHMC ]
Are you ready for the DRC Finals in June? DARPA doesn’t think you're ready enough:
DRAAAAMAAAAA!!! But seriously, it’s going to be great.
[ DRC Finals ]
This weekend is Easter Sunday; do you know where all of your Easter eggs are?
[ WVU IRL ]
DARPA's System of Systems (SoS) Integration Technology and Experimentation (SoSITE) program aims to develop and demonstrate concepts for maintaining air superiority through novel SoS architectures--combinations of aircraft, weapons, sensors, and mission systems--that distribute air warfare capabilities across a large number of interoperable manned and unmanned platforms. The vision is to integrate new technologies and airborne systems with existing systems more quickly and at lower cost than near-peer adversaries can counter them.
[ DARPA ]
I’m sure that we’ve posted this video before, but a new upload from MIT’s Personal Robots Group gives us a good excuse to post it again. It’s one of my absolute favorites:
Cookie Monster is very bad. He’s very bad, Leo.
[ MIT ]
I think it's safe to say that the little robot gymnast has surpassed all human gymnasts, and this is only version 26 with no end in sight:
[ hinamitetu ]
Here is the entire description for this video:
“My Time Machine - escaped by a mistake of my gardener - had a defect. You won’t guess where it arrived...
Further: must make a dictionary Morlock-English”
[ prallplatte ]
SparkFun’s 2015 Autonomous Vehicle Challenge has a new quirk this year: behold, the Discombobulator:
Coming to you June 20 at SparkFun in Niwot, Colo.
[ SparkFun AVC ]
The sheepdog lobby is going to be all up in paws about this:
[ Irish Post ]
The Autonomous Space Robotics Lab (ASRL) at University of Toronto Institute for Aerospace Studies (UTIAS) has been working on a tethered ground robot called TREX, which stands for Tethered Robotic Explorer. It’s based on a Husky, and with that tether, it can run until the heat death of the Universe (or as long as you can provide it with power):
[ ASRL ]
On March 28-29, 2015, Uplift Aeronautics trained a group of Arab-Americans—including refugees from the conflicts in Iraq and Syria—to operate a fleet of four Waliid UAVs for medical deliveries to inaccessible populations. Families took part in every activity, from packing cargo and making parachutes to running preflight checklists and operating ground stations.
[ Syria Airlift Project ] via [ DIY Drones ]
From the Georgia Tech Healthcare Robotics Lab, a few years back:
The robot Darci (with a 7-degree-of-freedom arm and series elastic actuators) extracting a set of keys from artificial foliage using model predictive control with a tactile-sensing sleeve. We use a full dynamic model of the robot arm in contact with the world. We also include a collision constraint to limit impact forces. Video is in real time. The controller has no map of the environment and extracts the keys from a known target location using a magnet attached at the end effector.
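The key idea in that description, picking actions by predicting their consequences and rejecting any that would violate a contact-force limit, can be shown with a toy example. This is a deliberately tiny 1-D sampling-based sketch, nowhere near the full-arm dynamic MPC Georgia Tech actually used, and every constant in it is invented:

```python
# Toy 1-D force-limited predictive control: a point mass is pushed
# toward a goal behind a compliant obstacle. Each control step we
# simulate a short horizon for a handful of candidate forces and
# pick the best one whose predicted contact force stays bounded.
# (All parameters are hypothetical illustration values.)

WALL = 1.0           # obstacle surface position (m)
STIFF = 200.0        # obstacle stiffness (N/m)
F_MAX_CONTACT = 5.0  # contact-force limit (N)
MASS, DT, HORIZON = 1.0, 0.05, 10

def contact_force(x: float) -> float:
    """Spring-like contact model: force grows with penetration."""
    return STIFF * max(0.0, x - WALL)

def simulate(x: float, v: float, u: float):
    """Roll the mass forward HORIZON steps under constant push u.
    Returns the final position and the peak predicted contact force."""
    peak = 0.0
    for _ in range(HORIZON):
        a = (u - contact_force(x)) / MASS
        v += a * DT
        x += v * DT
        peak = max(peak, contact_force(x))
    return x, peak

def mpc_step(x: float, v: float, goal: float,
             candidates=(-4.0, -2.0, 0.0, 2.0, 4.0)) -> float:
    """Choose the candidate force that minimizes distance to the goal,
    subject to the predicted contact force staying under the limit.
    Falls back to zero force if no candidate is feasible."""
    best_u, best_cost = 0.0, float("inf")
    for u in candidates:
        xf, peak = simulate(x, v, u)
        if peak > F_MAX_CONTACT:
            continue  # would press too hard on the obstacle
        cost = abs(goal - xf)
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u
```

From free space the controller happily pushes hard toward the goal, but if the mass is already pressed into the obstacle, every candidate predicts an excessive contact force and the controller backs off to zero, which is the gist of how a collision constraint keeps impact forces bounded.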
[ Georgia Tech ]
Forbes has a nice overview video of the progress that Matternet has been making towards drone delivery in a context that actually makes it a good idea:
[ Forbes Video ]
I’m sure every single robotics company has trouble with engineers falling asleep on the job. Intuitive Robots has the solution: a NAO plus an Android smartwatch plus some really loud and annoying alarms:
[ Intuitive Robots ]
We’ve kind of stopped posting drone cinematography vids, because after a certain point, there’s only so much uniqueness you can expect from them. Here’s a first, though: a drone carrying the 30 pounds of camera gear required to support a Phantom Flex4K ultra HD camera. The subject of its test shoot is not very inspiring, but you’re looking at 4K video at 1,000 fps, and it’s up on YouTube at that resolution (if your monitor supports it):
[ Brain Farm ] via [ Gizmodo ]
Cameras aren’t the only thing you can attach to a drone. How about a buzzsaw?
[ Tested ]
Aww, R2-D2 is looking for love! Will he find it in someone other than C-3PO?
[ Artoo In Love ] via [ Boing Boing ]
And finally, two nice long talks with two Peters to round out the week:
Peter Asaro: “Regulating Robots: Developing Policy for Robots”
Robotics stands on the cusp of an explosion of applications and widespread adoption. Already the development and popular use of small UAV drones is gaining momentum, self-driving cars could be market-ready in a few short years, and the next generation of fully autonomous military drones is in development. Yet the regulatory policies necessary to ensure the social and economic benefits of these technologies are not yet in place. The FAA has struggled to devise operational regulations for small UAV drones, and has not yet addressed the privacy concerns they raise. Google has influenced state legislatures to pass laws permitting self-driving cars, yet the liability issues and insurance regulations are open questions, as are the safety requirements for these cars to interact with human drivers. And while the United Nations has begun discussions over the possible need to regulate fully autonomous weapons, the development of such systems continues at a rapid pace. I will present my work on some of these issues, as well as ask whether a more comprehensive regulatory framework might be able to address the questions of ensuring public safety and privacy in the coming revolution in robotics.
[ Stanford CIS ]
Peter Robinson: “Computing with Emotions”
The ability to display and recognise emotions is an important aspect of social interaction between humans. We monitor each other's facial expressions, vocal nuances and body posture and gestures, and use them to make inferences about other people's mental states. Our understanding of mental states shapes the decisions that we make, governs how we communicate with others, and affects our performance. We express these social signals even when we are interacting with machines, but computer interfaces currently ignore them. In effect, computers are autistic. Recent advances in psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. Computer systems with emotional awareness can analyse a person's facial expressions, tone of voice and body posture and gestures. They infer a person's underlying mental state, such as whether he or she is agreeing or disagreeing, interested or bored, thinking or confused. Similar techniques endow humanoid robots with the ability to display the same signals. The techniques have applications in areas such as monitoring cognitive load in command and control operators, guiding on-line teaching systems and enhancing the sense of presence in teleconference systems.
[ CMU RI Seminar ]
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.