According to Georgia’s Attorney General Chris Carr, the state is one of only three, along with Virginia and Alaska, without a cybersecurity law making it illegal for someone to remotely access your computer, search it for sensitive information, and then sell that information to a third party. Presently, it is only illegal in Georgia to access a computer to delete or tamper with its contents. However, this will change if Georgia Senate Bill 315: The Computer Intrusion Bill is finally passed into law.
One could be forgiven for thinking, well, it’s about time. However, cybersecurity experts are worried that SB315 as written is so open-ended that it could turn a range of legitimate security research and other innocuous activities into criminal offenses. According to the Electronic Frontier Foundation (EFF), a person doing personal work on their business computer could be at risk of being charged, as would security researchers looking for vulnerabilities on corporate or government websites, or others who scrape information from public websites. The Georgia ACLU calls the bill “draconian,” while others worry that cybersecurity firms will be negatively affected.
SB315 as currently written states that any person who accesses a computer or computer network with knowledge that such access is without authority (such as through an unauthorized password disclosure) will be guilty of the crime of unauthorized computer access. Exceptions are made for parents of children under the age of 18 and for persons conducting legitimate business activity. The penalty for a conviction is a misdemeanor of a high and aggravated nature, which can net a person a US $5,000 fine and a year in jail.
In its broadest interpretation, under SB315, a person had better receive explicit permission before accessing any computer that is not their own. Other states are more circumspect and base their laws more along the lines of the federal Computer Fraud and Abuse Act, which considers the intent of the person conducting the unauthorized access, the EFF said.
The sponsors and proponents of SB315 dismiss the concerns of the EFF and others as wildly exaggerated. For example, Attorney General Carr asserts that SB315 “is not intended in any way, shape or form to criminalize legitimate behavior,” but he also doesn’t deny that such “legitimate behavior” could be very narrowly defined. Carr merely states, “Our district attorneys, with their limited time and resources, are not going to spend any time trying to prosecute a roommate using the Netflix password.”
In other words—trust us not to abuse what is actually allowed in the law.
However, trust may be in short supply. A major impetus for SB315 came from a politically embarrassing incident, which began in August 2016 when security researcher Logan Lamb accidentally discovered a misconfigured server belonging to Georgia’s Kennesaw State University’s Center for Election Systems (CES) that revealed a “motherlode” of highly sensitive and supposedly secured information. This included not only information on Georgia’s 6.7 million voters, but also the instructions and passwords for election workers to sign in to the central server on Election Day. There were also other security-related issues that could be exploited to disrupt an election, such as potentially changing candidates’ votes. The details can be found in an interview with Lamb in Politico magazine.
Given that the CES had been responsible for testing and programming Georgia’s voting machines since 2002, this was more than a minor issue. Lamb informed the CES of what he had found, and though they thanked him, Lamb also told Politico that they warned him to keep quiet, else the “politicians … would crush him.”
Apparently, while the CES corrected some issues Lamb raised, the plethora of sensitive voter and voting information was still easily accessible in March 2017, when it was found again by another security researcher, Chris Grayson. CES was notified again, but this time the lack of security became public, and lo and behold, the researchers were investigated by the FBI to see whether a crime had been committed. They were cleared, but CES soon lost its contract to manage the state’s voting operations because it had not reported Lamb’s initial discovery to state officials.
As word spread of the “breach,” Georgia played down its impact, especially on a special election that was happening in June of last year. A lawsuit was filed in August to overturn the very close election, on the grounds that the vulnerabilities Lamb discovered in August 2016 had remained unfixed for at least eight months, and therefore the election could have been deliberately tainted. Georgia maintains there is no evidence of vote tampering, but some experts ask—how would the state know if there were?
The lawsuit probably wouldn’t have gotten much traction, except for the fact that the CES deleted all the files on the server and its backups within days of the lawsuit being filed. Georgia officials insist that the erasure was not a deliberate destruction of evidence, but just routine practice. And no one should worry anyway, state officials said, since the FBI had a mirror image of the server as part of its investigation into Russian attempts at interfering with U.S. national elections. The FBI, however, will neither confirm nor deny that it has the contents of the server.
If SB315 had been in place before Lamb did his research, what are the odds that he would have been “crushed” by Georgia’s Attorney General once he disclosed the information to CES? Or would he, or anyone else who discovered the flaw, have disclosed it at all, given the risk of prosecution? Similarly, how many companies or government agencies would you trust not to use such a law to threaten prosecution if a researcher or someone else found an embarrassing security flaw?
Or instead, take the case of Stanford University, where an MBA student discovered online 14 terabytes of confidential data from financial aid applications to the University’s Graduate School of Business. The student told the University of the problem, and it immediately removed the data. However, the student kept a copy of the information (which did not include students’ names) and analyzed it. He published his findings in a 378-page report showing that Stanford was not being honest when it claimed “all fellowships are need-based” at its graduate business school. Stanford admitted that it had fibbed, and said it would be more transparent about its fellowships in the future.
I’ll let you predict what would have happened to this MBA student if what he did had taken place in Georgia with SB315 on the books.
Contributing Editor Robert N. Charette is an acknowledged international authority on information technology and systems risk management. A self-described “risk ecologist,” he is interested in the intersections of business, political, technological, and societal risks. Along with being editor for IEEE Spectrum’s Risk Factor blog, Charette is an award-winning author of multiple books and numerous articles on the subjects of risk management, project and program management, innovation, and entrepreneurship. A Life Senior Member of the IEEE, Charette was a recipient of the IEEE Computer Society’s Golden Core Award in 2008.