A Former Pilot On Why Autonomous Vehicles Are So Risky

5 questions for transportation-safety expert Missy Cummings


A photo illustration of Missy Cummings. Illustration: Stuart Bradford

In October 2021, Missy Cummings left her engineering professorship at Duke University to join the National Highway Traffic Safety Administration (NHTSA) in a temporary position as a senior safety advisor. It wasn’t long before Elon Musk tweeted an attack: “Objectively, her track record is extremely biased against Tesla.” He was referring to Cummings’s criticism of his company’s Autopilot, which is supposed to help the driver drive, though some customers have used it to make the car drive itself—sometimes with disastrous results.

Some of Musk’s fans followed his lead: Cummings received a slew of online attacks, some of them threatening.

Missy Cummings

Mary (Missy) L. Cummings is the director of the Autonomy and Robotics Center at George Mason University and a senior member of IEEE. She received a Ph.D. in systems engineering from the University of Virginia.

As a former Navy fighter pilot, Cummings was used to living dangerously. But she hates taking unnecessary risks, particularly on the road. At NHTSA, she scrutinized data on cars operating under varying levels of automation, and she pushed for safer standards around autonomy. Now out of the government and in a new academic perch at George Mason University, she answered five high-speed questions from IEEE Spectrum.

We are told that today’s cars, with their advanced driver-assistance systems (ADAS), are fundamentally safer than ever before. True?

Cummings: There is no evidence of such mitigation. At NHTSA we couldn’t answer the question of whether you’re less likely to get into a crash; there was no data. But if you do get into a crash, you’re more likely to be injured, because people in ADAS-equipped cars are more likely to be speeding.

Could it be that people are trading the extra safety these systems might otherwise have provided for other things, like getting home 3 minutes sooner?

Cummings: I call it risk homeostasis. It’s a big problem with Tesla, for example. You’re told the car has self-driving capability, with all these features, such as automatic braking. You think, “Oh, the car is going to do x, y, and z for me,” and then it turns out that it doesn’t.

Did you observe risk homeostasis back in your fighter-pilot days?

Cummings: It happened with air-to-ground bombing radar. Pilots figured out that you could use it to set up a self-contained approach to an aircraft carrier and then manage the landing by yourself. Pilots being the control freaks they are, that’s exactly what happened. But the system didn’t adjust for the pitching deck, so it set people up for much more lethal approaches.

Some have said that partial autonomy is the riskiest solution of all. What’s your take?

Cummings: The policy should be that either the computer is driving or you are driving. And by driving I mean steering; people do fine with regular cruise control. The act of keeping your hands on the wheel and guiding the car’s lateral motion is enough to keep your brain engaged. So, no L3 [conditional automation, in which the car drives itself but the driver must be ready to take the wheel], which is too confusing, and no hands-free L2 [partial automation]. I am not against the passing of control per se, but there should just be two modes of operation, with crystal clear feedback about which mode you are in.

When do you think true self-driving cars will come?

Cummings: It’s possible to do self-driving in narrow applications. Waymo has been giving rides for a long time in Chandler [Ariz.]. That environment is very structured, and it’s much easier to operate these systems there. My favorite application is last-mile delivery, such as food delivery; it could be very helpful when viruses spike. But the day when AI in cars can handle all conditions on the road, all of the time, is not going to come in my lifetime.

This article appears in the June 2023 print issue as “5 Questions for Missy Cummings.”

The Conversation (11)
Johan Cornelis Kleynhans (20 Jun 2023)

The lack of cited data supporting the contention that automated driving will lead to speeding is unfortunate - why would such behaviour be encouraged by an automated system per se? Many drivers drive too fast for lots of reasons - nothing to do with automation.

Raymond Keefe (16 Jun 2023)

Great article and I fully agree with the point Missy Cummings makes about which mode the vehicle is in. Either I'm driving (assisted maybe, but I am taking responsibility) or the vehicle is currently fully responsible. I'm also appalled that we continue to see online abuse and threats when genuine concerns are raised. Good critique helps us make products and processes better.

Tom Craver (14 Jun 2023)

Cummings says NHTSA has "no data" to indicate Autosteer reduces Tesla crash rates.

She says Autosteer is dangerous because people will drive faster (no data cited) and so will be more likely to be injured if they crash.

If people were 10% more likely to be injured in a crash due to speeding with Autosteer, but crashes were 40% lower, you'd be about 2/3rds as likely to be injured using Autosteer (0.60 × 1.10 = 0.66).

So basically NHTSA has no real data regarding Autosteer safety (shouldn't they?), but she feels it's dangerous, and makes up "risk homeostasis" to sound scientific.