Transportation

Why Our Company’s Trucks Won’t Carry Lidar

The cofounder of Starsky Robotics says the sensor is superfluous when human teleoperators are supervising things

Photo: Starsky Robotics

This is a guest post. The views expressed in this article are solely those of the blogger and do not represent positions of IEEE Spectrum or the IEEE.

Last year a partner at a well-known Silicon Valley venture firm wouldn't take a meeting with me and my colleagues because our autonomous driving system didn't employ lidar. He said vehicles had to use all available sensors to ensure safety.

I disagree. I'll explain why, but first let me lay out the approach of my company, Starsky Robotics.

Unlike many other companies, we are not trying to automate every bit of driving, least of all the parts that involve navigating crowded, chaotic urban environments. Rather, we specialize in long-distance trucking.

The problem we're addressing is the shortage of truck drivers. It's easy to see why they're getting scarcer: Trucking's a tough and tedious job, demanding long hours on the road every day and weeks, even months, away from family and friends. And without enough truck drivers, the availability of goods falls and their prices rise.

We're automating the easiest part of the task, which is highway driving. The rest of the job remains in the hands of experienced truck drivers. They operate the rigs remotely when getting on or off the highway and during the tricky situations that occasionally happen on the highway itself. Such interventions take up less than 1 percent of the driving time.

Take a typical journey, the run that goes from a distribution center in Hayward, Calif., to another distribution center in Georgia. The warehouse in Hayward is about 5 kilometers (3 miles) from the highway. Then there's 4,200 km (2,613 miles) of driving along Interstate-class highways. And then, in Georgia, the destination distribution center is less than 2 km from the highway. So the off-highway driving comes to about 6 km (4 miles).

Since only a tiny fraction of the driving is on local roads, automating the highway portion means automating 99 percent of the journey. (In the case of the California-to-Georgia run, it's actually 99.85 percent, but you get the idea.)
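For concreteness, here's the back-of-the-envelope arithmetic behind that figure, a trivial sketch using the distances quoted above:

```python
# Back-of-the-envelope check of the highway share for the
# Hayward-to-Georgia run described above (figures in miles).
off_highway_mi = 4        # ~3 miles at the Hayward end, ~1 mile in Georgia
highway_mi = 2613         # Interstate-class driving in between
total_mi = off_highway_mi + highway_mi

share = highway_mi / total_mi
print(f"Highway share of the trip: {share:.2%}")  # -> 99.85%
```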

Highways are designed with long straightaways and gentle curves, which makes driving at legal speeds easy for humans. And remember, all we humans have to work with are our eyes and our reaction times.

That's why my company began with the sensor that's closest to the human eye: the camera. Our cameras are automotive grade, which means they're engineered to work in road vehicles through every phase of their working life; they are also relatively cheap and available off the shelf. Another great thing about cameras is that they are highly customizable. Our prototype truck employs seven of them, each one configured and oriented to monitor a specific field of view, for a full 360 degrees of coverage.
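To make the idea concrete, here is a hypothetical sketch of how such a rig might be described in software. The names, headings, and fields of view below are illustrative assumptions, not our actual configuration:

```python
from dataclasses import dataclass

# Hypothetical sketch of a seven-camera layout like the one described
# above. Names, headings, and fields of view are illustrative
# assumptions, not Starsky's actual configuration.

@dataclass
class Camera:
    name: str
    heading_deg: float  # direction the camera points; 0 = straight ahead
    fov_deg: float      # horizontal field of view

rig = [
    Camera("front_narrow", 0, 35),     # long-range view straight ahead
    Camera("front_wide", 0, 90),
    Camera("left_front", -60, 90),
    Camera("right_front", 60, 90),
    Camera("left_rear", -135, 90),
    Camera("right_rear", 135, 90),
    Camera("rear", 180, 90),
]

# Crude sanity check: the fields of view must at least span the full
# circle (a real coverage analysis would account for overlaps and gaps).
assert sum(c.fov_deg for c in rig) >= 360
```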

For more reliable measurements, we also use automotive-grade radar. This, too, is a well-documented, well-understood sensor, and it's available at a reasonable price. Radar is really good at sensing the existence of a potential obstacle and at measuring its velocity. It's not so good at pinpointing an obstacle's location, and it tends to generate a lot of false positives; a manhole cover can loom larger than it is. So, to filter out the false positives, we fuse the data stream from the radar with the stream from the cameras. Each sensor covers the weaknesses of the other.
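Here's a minimal sketch of that cross-check, assuming simple bearing-based matching. The data structures and the 2-degree threshold are illustrative assumptions, not our production code:

```python
# Minimal sketch of the radar-camera cross-check described above: a
# radar return is kept only when a camera detection points in roughly
# the same direction. Radar contributes range and velocity; the camera
# confirms that the return is a real obstacle and not, say, a manhole
# cover. Data structures and the threshold are illustrative assumptions.

def fuse(radar_tracks, camera_detections, max_bearing_diff_deg=2.0):
    """radar_tracks: dicts with 'bearing_deg', 'range_m', 'velocity_mps'.
    camera_detections: dicts with 'bearing_deg' and 'label'."""
    confirmed = []
    for track in radar_tracks:
        for det in camera_detections:
            if abs(track["bearing_deg"] - det["bearing_deg"]) <= max_bearing_diff_deg:
                confirmed.append({**track, "label": det["label"]})
                break
        # Radar-only returns with no camera match are treated as likely
        # false positives and dropped.
    return confirmed

radar = [
    {"bearing_deg": 0.5, "range_m": 120.0, "velocity_mps": -3.0},  # real car
    {"bearing_deg": -8.0, "range_m": 40.0, "velocity_mps": 0.0},   # manhole cover
]
camera = [{"bearing_deg": 0.4, "label": "vehicle"}]

print(fuse(radar, camera))  # only the camera-confirmed return survives
```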

That brings us to lidar. Why not add a third set of eyes, as it were?

There are indeed things that lidar does well. It's really good at noticing obstacles close to the ground. And, because its laser provides illumination all its own, lidar can see as well in the dark as in the sunlight. But this sensor suffers from serious weaknesses.

Limited range is at the top of the list. A fully loaded truck moving at highway speed needs a lot of room to stop, at least 150 to 200 meters, and the long-range lidars now available can't sense things in sufficient detail that far ahead. The points they return are spread too far apart at that distance to provide the information required.
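Some rough numbers make the point. The reaction time, braking rate, and lidar resolution below are illustrative assumptions, not measured specs; what matters is the order of magnitude:

```python
import math

# 1) Stopping distance for a loaded truck at highway speed.
v = 105 / 3.6             # 105 km/h (~65 mph) in meters per second
reaction_time = 1.0       # seconds before the brakes engage (assumed)
decel = 3.0               # m/s^2, a conservative rate for a loaded rig (assumed)
stopping_m = v * reaction_time + v**2 / (2 * decel)
print(f"Stopping distance: {stopping_m:.0f} m")      # ~171 m

# 2) How far apart lidar points land at that range, assuming a typical
#    0.1-degree horizontal resolution.
spacing_m = 200 * math.radians(0.1)
print(f"Point spacing at 200 m: {spacing_m:.2f} m")  # ~0.35 m between points
```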

Reliability is also limited. I mentioned that we use automotive-grade components, ones we're certain will work over the lifetime of our vehicles in all the conditions we require. Lidar isn't there yet. Some units tend to spin themselves apart; others are held together with glue. Many will fall apart after three to six months of use.

It is possible that lidar would help classify obstacles that are tricky for cameras and radar, such as a mattress, a person, or an alligator. However, it's easier to preliminarily classify such a thing as a "strange object" and then ask a human teleoperator for help. Remember, we're using teleoperators to supervise our system when conditions get tricky. Our system doesn't have to know what that strange obstacle up ahead is, only that it's strange.
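Here's a sketch of that fallback logic, with a hypothetical classifier confidence threshold and escalation path standing in for the real system:

```python
# Sketch of the fallback described above. Instead of trying to name a
# tricky obstacle, the system only decides whether it is confident; low
# confidence means "strange object," which escalates to a teleoperator.
# The classifier output, threshold, and escalation path are illustrative
# assumptions.
CONFIDENCE_THRESHOLD = 0.85

def handle_obstacle(label: str, confidence: float) -> str:
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"handle autonomously: {label}"
    # We don't need to know whether it's a mattress, a person, or an
    # alligator -- only that it's strange enough to warrant a human.
    return "strange object: request teleoperator supervision"

print(handle_obstacle("vehicle", 0.97))   # confident -> stay autonomous
print(handle_obstacle("unknown", 0.41))   # not confident -> ask a human
```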

We are thus solving a highly constrained problem. Teleoperators drive the few miles on and off the highway and then, for the long, boring stretches of highway driving, autonomous systems maintain the rig at a constant speed on straight-line or gently curved roads. That's why cameras and radar are sufficient to the task.

Engineering is the application of science with real-world constraints. Lidar technology, applied to autonomous driving, is a good science project that gets in the way of engineering. At Starsky Robotics, we don't use lidar because we don't need to.