Who’s Liable for George Hotz’s Self-Driving Software?

Can George Hotz escape liability if something goes wrong with his open-source self-driving car software?
Illustration: iStockphoto

You could download Comma.ai’s new open-source Python code from GitHub, grab the necessary hardware, and follow the company’s instructions to add semi-autonomous capabilities to specific Acura and Honda models (with more vehicles to follow). Comma.ai’s CEO George Hotz told IEEE Spectrum last week that the code has safety features, but what would happen if there’s a bug and your car crashes into a building, another car, or a pedestrian? Self-driving cars are notoriously difficult to test for safety.

Hotz writes in an email: “It’s not my code, I did not release it”—Comma.ai Inc. “released and maintains it.” Most legal experts who spoke with IEEE Spectrum—and Hotz himself—believe that if you use the company’s code and something goes wrong, then it isn’t liable for damages. You are.

But Consumer Watchdog advocate John Simpson doesn’t believe this is fair. He says Hotz “was somewhat responsible” for any damage that could occur. Although responsibility gets “murkier” as more developers modify the code, he says Hotz made it public and should therefore be held liable along with the user.

The controversy exists in part because autonomous driving legislation is just starting to take shape around the world. Automakers such as Tesla and Volvo have publicly declared that they will take responsibility for faults in their self-driving software. Comma.ai Inc.’s release, however, puts the liability risk in an even grayer area because the code is open source.

There have been only a handful of open-source lawsuits worldwide—over, for example, open-source code improperly ending up inside commercial products. The European Commission proposed legislation to apply consumer protection rules to software in 2009, but today there are no EU-wide rules for faulty digital content. And in the United States, a series of court cases in the 1990s established that open-source code is free speech protected under the First Amendment of the U.S. Constitution.

As of Monday, Hotz was protected by a standard MIT open source license, which shields him from legal claims and shifts legal responsibility to the downloader. Comma.ai Inc. includes the following disclaimer in a README document:

“THIS IS ALPHA QUALITY SOFTWARE FOR RESEARCH PURPOSES ONLY. THIS IS NOT A PRODUCT. YOU ARE RESPONSIBLE FOR COMPLYING WITH LOCAL LAWS AND REGULATIONS. NO WARRANTY EXPRESSED OR IMPLIED.”

Heather Meeker, a partner at O’Melveny & Myers LLP who specializes in IT and software issues, writes in an email that “Open source is, in the sense of warranties, a ‘buyer beware’ model.”

If Hotz sold a product that included the open source software, or made guarantees to car manufacturers that it worked properly, then he could be held liable for damages. Or if a car manufacturer used Hotz’s software, then the manufacturer could potentially be liable if the software caused the damage, she says. But otherwise, the risk shifts to whoever uses the code.

Unless, of course, a court ruled that the software really is a product.

Larry Coben, a lawyer at Anapol Weiss who specializes in product liability cases, says “I don’t know” whether the software counts as a product. He gave two examples. A supplier who provides an airbag for a car could be found liable if the module doesn’t perform to expectations, or a person selling an alarm system could be liable if he or she installed it carelessly.

Meeker says it’s not possible to disclaim or contract away product liability for personal injury. For example, you can’t sell defective lightbulbs that explode and sign a contract that protects you from liability.

It also might not be enough to say Hotz’s code is free. She points out that you don’t have to receive money to be viewed as a “seller.” For example, if you give away lightbulbs on the street and they explode, killing people, you could be held liable.

Kit Walsh, an attorney at the Electronic Frontier Foundation, writes in an email that “code is like a set of instructions, and publishing instructions generally cannot be punished consistent with the First Amendment—even if those instructions involve something dangerous like making a weapon or to try to eat a spoonful of cinnamon that will probably wind up in your lungs.”

Lawrence Rosen, a lawyer at Rosenlaw who specializes in open source software and open standards, agrees. The code is “not a product,” he says.

Walsh writes that “Prosecutors and plaintiffs often urge courts to disregard traditional First Amendment protections in the case of software because it has a ‘functional’ aspect when it can be executed, making it easier for the instructions it describes to be followed.” But she doesn’t find that argument persuasive, especially in this case, where extra steps are needed to combine the software with hardware and manual intervention.

However, Walsh notes that if you intend to cause harm with your speech, then you can be held liable for damages. Rosen says inducing others to use the code improperly or unsafely, or failing to properly warn individuals, could put Hotz at risk.

Walsh says that it might be unlawful to drive an unsafe car, “actively conspire” to circumvent vehicle safety legislation, or sell defective car modification hardware. Or in some cases, it might be lawful to modify a vehicle and drive it on private roads, but not public roads.

Michael Overly, a partner at Foley & Lardner LLP who works on technology-related agreements such as software licenses, sees another way Hotz could be at risk. He says that while the MIT license and disclaimers protect developers from being held liable by whoever downloads the software, they do not protect developers from lawsuits by third parties—people not bound by the terms of the agreement. Although Hotz has opened up a whole new area, Overly says, and open sourcing the code is a great idea for testing purposes, a third party who suffers an injury or property damage caused by a software malfunction could still sue for negligence—from anywhere in the world.

Overly says that to get full protection, Hotz would need an additional license or contract in which the user warrants that he or she will use the software for its intended purpose (research), will not harm individuals or property, and will pay any financial damages if the developer is sued (additional indemnity). These extra terms would work in conjunction with an existing open source license, Overly says.

And there might be other lawsuits on the horizon. Rosen expects laws to change and software installed in any moving vehicle to require certification by a federal agency.

Meeker writes that this “could change the answer [of who’s liable, but] I would expect those regulations, like the general principles of products liability, to apply to actual products rather than software.”

The U.S. National Highway Traffic Safety Administration did not respond to a request for comment.

“If I were in [Hotz’s] shoes, I would not take [the situation] lightly,” Rosen says. He says it costs about $250 nowadays to file a lawsuit, and much, much more than that to retain an attorney to defend against one.

Hotz says he has spoken with lawyers and heard from NGOs, think tanks, and freedom-fighting organizations. “We can easily defend against any frivolous lawsuits,” he says.

This post was corrected on 14 December to clarify that the software was released by Comma.ai, not by George Hotz himself.
