Driving a car in the middle of the desert at night without any headlights is easy for any driver. The trip becomes a little more challenging if there are obstacles the driver has to avoid, and harder still if there’s a road to navigate. At this point, most humans (those without immediate access to a high-quality night vision system) might start to have some trouble. Robots, being much better at this whole driving thing than humans are, don't really care whether there’s daylight or street lights, as some recent testing from Ford demonstrates.
Google, for the record, has been testing its autonomous cars on public roads in California after dark for a while now. But driving at night on public roads is not quite the same thing: at night, you still have headlights and streetlights and other cars to help you out. Ford is instead having its cars drive around a test track in the middle of the desert at night, with no lights whatsoever.
While this is certainly a very cool demo (the highlight of which is actually seeing the LIDAR beams in infrared), there are a few things to keep in mind. The first thing is that darkness doesn't present much of a challenge to a navigation system that relies on LIDAR and a pre-existing map. In fact, LIDAR sensors have an easier time of it the darker it gets. Laser sensors generate their own light, but in order to function, they also have to be able to detect that light when it bounces off of an object and returns to the sensor. This works best when the beam strikes a very reflective object in a very dark place; as reflectivity goes down and ambient light goes up, the effective range of the LIDAR decreases. In other words, the brighter the ambient light, the harder it is for the LIDAR sensor to distinguish its laser from sunlight, which is why night makes things easier.
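You can capture that reflectivity-and-ambient-light tradeoff with a toy signal-to-noise model. This is just a sketch with made-up constants, not how any real LIDAR is specified: it assumes return power falls off with the square of distance and that ambient light adds directly to the noise floor.

```python
import math

def lidar_max_range(reflectivity, ambient_light, tx_power=100.0,
                    dark_noise=1.0, snr_threshold=5.0):
    """Toy model (arbitrary units): return power scales with target
    reflectivity and falls off as 1/r^2; detection requires the return
    to exceed the noise floor by snr_threshold. Ambient light adds
    straight into the noise floor."""
    noise = dark_noise + ambient_light
    # Solve tx_power * reflectivity / r^2 >= snr_threshold * noise for r
    return math.sqrt(tx_power * reflectivity / (snr_threshold * noise))

# Same target, dark desert night vs. bright midday sun:
night = lidar_max_range(reflectivity=0.5, ambient_light=0.1)
noon = lidar_max_range(reflectivity=0.5, ambient_light=10.0)
```

In this model, `night` comes out several times larger than `noon`, and a more reflective target extends the range either way, which matches the article's point: less ambient light means more effective range.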
This problem of oversaturation isn't unique to LIDAR. It's an issue with traditional cameras as well, including the kind that most of us have in our brainboxes. In particular, autonomous cars (and human drivers) have issues with two very common driving situations. The first is driving into the sun when it's low in the sky. The brightness of the sun makes it very difficult to see anything else, so we squint and put on sunglasses. Autonomous cars have cameras that do their version of squinting and putting on shades, but glare is still difficult to overcome. The second situation is a wet road on a sunny day, when sunlight reflecting off of the road makes it almost impossible to see things like lines and lane markers.
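Why does glare erase lane markings rather than just making everything brighter? Because camera pixels clip at a saturation point. Here's a deliberately simplistic sketch (the numbers are invented, modeled on an 8-bit sensor) showing how clipping destroys the contrast between a lane marker and the asphalt around it:

```python
def sensor_reading(scene_luminance, glare, full_well=255):
    """Toy 8-bit camera pixel: readings clip at the sensor's
    saturation point, so strong glare erases contrast between
    bright and dark parts of the scene instead of preserving it."""
    return min(scene_luminance + glare, full_well)

# Lane marker vs. asphalt on a clear day: plenty of contrast.
contrast_clear = sensor_reading(200, 0) - sensor_reading(60, 0)
# Same scene with heavy glare off a wet road: both pixels saturate,
# and the lane marker becomes indistinguishable from the road.
contrast_glare = sensor_reading(200, 220) - sensor_reading(60, 220)
```

With no glare the marker is 140 units brighter than the road; under heavy glare both pixels pin at 255 and the contrast drops to zero, which is the computer-vision version of being blinded by a low sun.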
For more details on this, check out our post from November 2014, where we talked to a Korean research team about how issues like these caused problems for them during a self-driving car competition.
The other thing to consider with Ford's demo is that, great as LIDAR is, it can't help an autonomous car do everything that it needs to do—despite what the automaker's press release for some reason wants you to think:
“Thanks to LiDAR, the test cars aren’t reliant on the sun shining, nor cameras detecting painted white lines on the asphalt,” says Jim McBride, Ford technical leader for autonomous vehicles. “In fact, LiDAR allows autonomous cars to drive just as well in the dark as they do in the light of day.”
If the only sensor your autonomous car has is a LIDAR, then yes, this is true. However, any autonomous car supposedly designed to drive on an actual road (and not a test track) really does need cameras to detect painted white lines on asphalt, among many other things. LIDAR only helps a car build and respond to a 3-D map of its environment, which is great for avoiding obstacles but not great for, say, reading road signs, looking at traffic lights, or generally understanding what's going on in the world. Even with a heavily annotated 3-D map (so that the car can position itself correctly in the lane without looking at lines), and even assuming that at some point cars will be able to talk to traffic lights directly, an autonomous vehicle would still need cameras to help it deal with the inevitable environmental variability that happens in real-world driving.
LIDAR is, for the foreseeable future, one of the most important and capable sensors that autonomous cars have. But it can't work alone. For robocars to safely and effectively navigate on urban roads, they need to fuse data from as many sensors as they possibly can, including LIDAR, cameras, radar, GPS, gyroscopes and accelerometers, and scrolls of prayer to the robot gods (if that would make any kind of difference). Even at night.
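One common way to think about why fusing sensors beats any single sensor (prayer scrolls aside) is inverse-variance weighting: each sensor's estimate is weighted by how much you trust it, and the fused estimate is always more certain than the best individual one. This is a minimal sketch with hypothetical numbers, not any particular automaker's pipeline:

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of independent estimates,
    each given as (value, variance). Returns the fused value and
    its variance, which is smaller than any input variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Hypothetical lane-offset estimates in meters, with variances:
# LIDAR-against-map (trusted), camera (decent), GPS (coarse).
fused, fused_var = fuse([(0.32, 0.01), (0.35, 0.04), (0.20, 0.25)])
```

The fused value lands near the most trusted sensor, but even the coarse GPS reading tightens the final uncertainty a little — which is the whole argument for feeding a robocar every sensor you can, day or night.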
Evan Ackerman is a senior editor at IEEE Spectrum. Since 2007, he has written over 6,000 articles on robotics and technology. He has a degree in Martian geology and is excellent at playing bagpipes.