Last month, Hyundai quietly held its 2014 Future Automobile Technology Competition in South Korea. Out of 12 participating teams, four made it to the final round, which required the cars to navigate a test circuit. The autonomous cars had to avoid obstacles, stop for pedestrians, obey traffic laws, and do all of the things that self-driving cars will have to do if we’re ever going to hop in, plug in a destination, and turn our attention elsewhere. The competition wasn’t anything that we haven’t seen before—except that during the second day of the competition, it rained.
It looks like the team from KAIST’s Unmanned Systems Research Group wasn’t required to let its car traverse the course while it was actively raining, but it started out immediately after a heavy shower, on a wet road. The surface was slippery, certainly, but the real problem is that depending on the angle between the car, wet surfaces, and the sun, the car’s cameras can have a difficult time recognizing all kinds of things, including lane markings and street signs.
What’s particularly interesting here is that KAIST has posted videos from both days of the competition: it’s the same car, on the same track, with the same hardware, running the same software. The only difference is that the road is dry on the first day, and wet on the second day.
We’ve embedded both videos below, but I’d suggest watching them with YouTube Doubler via this link, which plays both videos at the same time, side by side. Or, you can just hit play on both of the embeds below. In either case, if you want to do a comprehensive side-by-side viewing, you’ll need to occasionally skip ahead on the wet track video, since that car has a few, um, issues that it has to work out. Highlights from the second video, when the track was wet, are beneath both vids.
2:02 – Hyundai hits the e-stop when the car appears to miss the center line and veer off to the side of the road. The car is turning straight into the sun when this happens. The car had no issues here during the first day’s “dry run,” and kept to the correct side of the road.
2:50 – Car fails pedestrian detection. Again, it looks a bit like the sun is right behind the pedestrians. The car also did fine here during the dry run.
3:20 – A bit of uncertainty at the intersection here, relative to the dry run.
4:15 – Lane and road detection failure. Looks like the software may have had to be soft-reset?
5:43 – Oops. Car completely fails to detect the curb at the center of an intersection and comes very close to plowing into a light pole. During the dry run, unsurprisingly, it aced this.
7:20 – Hyundai again hits the e-stop. May have been an overabundance of caution with this one (KAIST certainly thinks so), but it does kinda look like the vehicle was heading for another curb. At the very least, it was not following the path it was expected to, and deviated from the path it took when the roads were dry.
8:40 – Car detects one road sign but fails to detect another.
9:15 – During the dry run, the car ran into the barricades here, but did fine on the wet run. Go figure.
9:45 – The car does manage to back into the barricades while parking at the end of the wet run, though.
So, what have we learned?
The first step with any self-driving car is, of course, to get it working under optimal conditions. KAIST and many other groups are very close to being able to do this. But weather is a major source of uncertainty, for both human and robot drivers. Robots are likely to be better than humans at dealing with slippery road surfaces, because they can do things a human can’t, such as modulating the acceleration or braking of each wheel of a car individually to maximize traction.
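To make that per-wheel idea concrete, here’s a toy sketch of a slip-based traction controller. This is purely illustrative (not KAIST’s or Hyundai’s actual algorithm, and the target-slip and gain values are made-up assumptions): each wheel’s drive torque is scaled back in proportion to how far that wheel’s slip ratio exceeds a target.

```python
# Toy per-wheel traction controller (illustrative only; the target_slip
# and gain values are arbitrary assumptions, not from any real vehicle).

def slip_ratio(wheel_speed: float, vehicle_speed: float) -> float:
    """Longitudinal slip under drive: 0 = pure rolling, near 1 = spinning freely."""
    if wheel_speed <= 0:
        return 0.0
    return max(0.0, (wheel_speed - vehicle_speed) / wheel_speed)

def adjust_torques(requested_torque: float, wheel_speeds: list[float],
                   vehicle_speed: float, target_slip: float = 0.1,
                   gain: float = 5.0) -> list[float]:
    """Return a per-wheel torque command that backs off any spinning wheel."""
    commands = []
    for w in wheel_speeds:
        excess = slip_ratio(w, vehicle_speed) - target_slip
        # Cut torque proportionally when slip exceeds the target.
        scale = max(0.0, 1.0 - gain * excess) if excess > 0 else 1.0
        commands.append(requested_torque * scale)
    return commands

# Example: the first wheel is spinning on a wet patch, the others grip fine.
print(adjust_torques(200.0, [12.0, 10.5, 10.4, 10.5], vehicle_speed=10.0))
```

The point of the sketch is simply that a computer can run this loop independently for each of four wheels hundreds of times per second, which is exactly what a human driver with one throttle pedal cannot do.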
But the weather-related challenges for autonomous cars are, at this point, nearly all in perception. Wet roads are an issue, but so are heavy fog, rain, and snow. Combining any of these conditions with darkness only compounds the problem. Autonomous cars generally understand the rules of the road, and they know what to do and what not to do in virtually any situation that they might face. But if they don’t have enough accurate information from their sensors, any decision that they make is likely to be a poor one.
We want to thank KAIST’s Unmanned Systems Research Group for posting these videos, even if the competition didn’t go as well as the team might have wanted. We’re very used to seeing the successes of autonomous cars, but it’s much more interesting (and educational) to see what challenges them. And hey, at least KAIST didn’t end up in a ditch:
Good job signaling, though.
[ KAIST USRG ] via [ Business Korea ]
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.