
The Tesla Autopilot Crash Could Have Been Much Worse

The next time, a crash may kill innocent bystanders. Would that kill the Autopilot feature?


Tesla Motors CEO Elon Musk
Photo-Illustration: Getty Images

Five days after the world learned that the driver of a Tesla on Autopilot had died in a crash, there are still a lot of unanswered questions. 

The first unanswered question, taking them in strictly chronological order, has to do with the timing. The crash happened on 7 May, but we learned of it 54 days later, on the Thursday evening that inaugurated the long U.S. holiday weekend ending on 4 July.

It’s an old trick, one that General Motors pulled back in 1987. I was then a mere cub reporter who’d left work a little early for the long Thanksgiving weekend. Minutes later, at the very close of business hours, GM announced a recall of its Fiero two-seater, whose engine had shown an unfortunate tendency to burst into flames.

Fool me once, GM, shame on you. Fool me twice, shame on me.

A second unanswered question is whether the Tesla’s driver, Joshua Brown, a 40-year-old former Navy SEAL, had been paying proper attention. One report said that a “Harry Potter” movie was heard playing on a DVD player in the wrecked car. Another report denied that any such thing happened.

If Brown had in fact been paying attention but simply lacked the time to intervene, then that’s really bad news for Tesla and for its policy of using its customers to beta-test the technology. Google has repeatedly insisted that it will release no self-driving technology until professional drivers have proved it safe. Volvo is taking system redundancy to great lengths. Tesla alone is playing things fast and loose.

A third unanswered question is what Tesla intends to do to prevent more such accidents. The company is due to release version 8.0 of Autopilot, and though it will bring many improvements, it seems that none of them would have averted the 7 May crash.

But the fourth question is probably what troubles Elon Musk’s sleep the most, and it’s purely hypothetical. What if that Tesla had crashed not into a huge truck but into something smaller than itself?

The laws of physics provide part of the answer: the smaller vehicle in a crash generally gets the worst of it. But the laws of public relations give you the more important part: Tesla’s name would be mud. If—or shall I say when—a Tesla plows into a bunch of school children, or a peloton of cyclists, or even just a small car, people will die who never chose to run the risk of testing Autopilot.

Neither Brown’s family nor his insurance company is likely to have a case against Tesla. After all, the company has always made it clear that the driver must watch Autopilot at all times and that all legal liability rests on the driver’s shoulders. It has even made it hard for drivers to take their hands off the wheel for any length of time.

But from the first day Autopilot became available to Tesla owners, some of them have been pushing the envelope, often with glee, as the YouTube videos they post make clear. These guys are taking their lives in their hands, but any future victims of their behavior will be innocents.

Such behavior does help test the technology, and that’s a good thing. You can argue that we modern people are often so averse to risk that we slow the development of technologies that promise to lower risk further for us all. The U.S. Food and Drug Administration is perhaps too willing to delay potentially life-saving drugs and medical devices. The Federal Aviation Administration has obviously spent more time ruminating on regulations for drones than is justified by the danger they present.

But every person has the right to avoid taking a newfangled risk. And how can we do that when we must share the road with beta-testers?
