Automated Facial Recognition: Menace, Farce, or Both?

UK police trials continue to highlight the weaknesses of real-world AFR use and their implications for civil liberties


Photo-illustration: John Lund/Getty Images

The American Civil Liberties Union (ACLU) and several other groups dedicated to protecting civil rights and liberties recently sent a letter (pdf) to Amazon demanding that it stop selling its automated facial recognition (AFR) system called Rekognition to government agencies, especially police departments. According to the groups, doing so “poses a grave threat to customers and communities across the country.”

In its letter, the ACLU argues that Amazon, which has in the past opposed secret government surveillance, should not be in the business of selling AFR technology that the company claims can “identify people in real-time by instantaneously searching databases containing tens of millions of faces.” Further, the ACLU insists, Rekognition’s capability to track “persons of interest,” coupled with other features that “read like a user manual for authoritarian surveillance,” lends itself to the violation and abuse of individuals’ civil rights.

Amazon naturally disagrees. It responded to the letter with a statement that says: 

Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn’t buy a computer because it was possible to use that computer for illegal purposes? Like any of our AWS services, we require our customers to comply with the law and be responsible when using Amazon Rekognition.

Amazon’s response, of course, raises the question of how the company would know that its technology was being “abused,” especially because there are few laws specifying how automated facial recognition can or cannot be used. A few years ago, Georgetown University published a study titled The Perpetual Line-up: Unregulated Police Facial Recognition in America. The report revealed that nearly half of all adults in the United States are already in at least one law enforcement AFR network. It also showed just how few regulations are in place to ensure that AFR is not used improperly.

Even where regulations are in place, they aren’t necessarily followed. For instance, in 2016, the U.S. Government Accountability Office (GAO) published a report finding that the FBI didn’t always adhere to federal privacy laws and policies governing AFR use. Furthermore, the bureau had not taken steps to help ensure the accuracy of its face recognition technology. A 2017 GAO follow-up report stated that the FBI still had not taken the actions needed to protect the public’s privacy when automated facial recognition is used. On a different but related note, the efficacy of AFR in generating investigative leads remains in doubt.

According to the FBI, automated facial recognition is used to pull between 2 and 50 likely candidates from its database of over 400 million photos (as of December 2015). Once candidates are identified, each image is manually reviewed by agents to determine whether it matches the person being sought. Even if there is a positive match, says the FBI, the information can be used only as “an investigative lead, and not as a means of positive identification.” The FBI claims its AFR accuracy is a “minimum of 85 percent of the time within the top 50 candidates,” which is pretty poor. Its false-positive rate has not been measured, nor does the FBI seem keen to measure it.
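
To make that workflow concrete, here is a minimal sketch of this kind of gallery search, using entirely hypothetical data and a stand-in embedding model (nothing here reflects the FBI's actual system): photos are compared as feature vectors, and the system returns a ranked top-50 candidate list for humans to review rather than a single yes/no identification.

```python
import numpy as np

# Hypothetical setup: every enrolled photo has already been mapped to a
# unit-length feature vector ("embedding") by some face-recognition model.
# 100,000 random vectors stand in for a real multi-million-photo database.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(100_000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def top_k_candidates(probe, gallery, k=50):
    """Rank gallery photos by cosine similarity to the probe image and
    return the indices of the k closest matches for manual review."""
    probe = probe / np.linalg.norm(probe)
    scores = gallery @ probe          # cosine similarity against every photo
    return np.argsort(scores)[::-1][:k]

probe = rng.normal(size=128)          # embedding of the face being sought
candidates = top_k_candidates(probe, gallery)
# Agents would now review these 50 photos by hand; even a strong match is
# only an investigative lead, not a positive identification.
print(candidates[:5])
```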

The accuracy of automated facial recognition is at the paradoxical heart of the ACLU’s concerns in its letter to Amazon. On one hand, if an AFR system’s accuracy is poor, it will have a high false-positive rate. Consequently, a large number of innocent people may become “investigative leads” for crimes with which they have no connection at all. Recent studies show that while AFR systems are very accurate at identifying Caucasian males, they are much less reliable for just about everyone else, especially minority ethnic women.
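
The scale of that risk follows from simple base-rate arithmetic. The numbers below are made up for illustration and come from no cited study, but the pattern holds whenever nearly everyone scanned is innocent: even a small per-face error rate swamps the handful of genuine matches.

```python
# Illustrative base-rate arithmetic; all figures are hypothetical.
crowd_size = 10_000        # faces scanned at an event
on_watchlist = 1           # people present who are actually wanted
false_match_rate = 0.01    # 1% of innocent faces wrongly flagged
true_match_rate = 0.90     # 90% chance a wanted face is flagged

false_alerts = (crowd_size - on_watchlist) * false_match_rate   # ~100
true_alerts = on_watchlist * true_match_rate                    # ~0.9

share_wrong = false_alerts / (false_alerts + true_alerts)
print(f"false alerts: {false_alerts:.0f}, true alerts: {true_alerts:.1f}")
print(f"share of alerts pointing at innocent people: {share_wrong:.1%}")  # ~99%
```

Under these assumptions, a system that misreads only 1 in 100 faces still produces roughly a hundred false leads for every genuine one.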

AFR accuracy also degrades quickly under less-than-ideal conditions: when no high-quality image is available, when lighting is poor, or when the person’s face can’t be seen straight-on with a neutral expression. Automated facial recognition trials by the Metropolitan Police in London using CCTV systems have shown a 98 percent false-positive identification rate, while trials by the police in South Wales showed a 92 percent false-positive rate. However, neither police force regards misidentifications as false positives as long as no one is arrested, even if an innocent person is investigated as a result of mistaken identification. Both police forces are going forward with AFR despite the inaccuracies, which does not make the UK Information Commissioner entirely happy.
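
A note on what those percentages measure: in the trial reporting, the “false-positive rate” is the share of the system’s alerts that turned out to be misidentifications, not the share of all scanned faces that were wrongly flagged. The sketch below uses hypothetical counts, chosen only to make the distinction clear:

```python
# Two different quantities often travel under the name "false-positive rate."
# All counts below are hypothetical, for illustration only.
faces_scanned = 50_000   # everyone who walked past the cameras
alerts = 100             # faces the system flagged against a watchlist
false_alerts = 98        # flagged people not actually on the watchlist

# 1) Share of alerts that were wrong -- the kind of figure quoted for the
#    Metropolitan Police trials (here: 98%).
false_discovery_rate = false_alerts / alerts

# 2) Share of all scanned faces wrongly flagged -- a far smaller number,
#    which is how the police forces prefer to frame the same results.
per_face_false_match_rate = false_alerts / faces_scanned

print(f"of the system's alerts, {false_discovery_rate:.0%} were misidentifications")
print(f"of all faces scanned, {per_face_false_match_rate:.2%} were wrongly flagged")
```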

Given the real-world results of AFR, the ACLU and other civil libertarians are worried about Amazon’s Rekognition system for what seems like the opposite reason. They’re concerned because AFR technology is quickly improving to the point where high accuracy will soon be possible even under poor visual conditions, before legal constraints on AFR’s use have been put into place. It is exceptionally difficult, though not impossible, to put restrictions on a technology once law enforcement has adopted it as de facto practice.

Automated facial recognition’s cost of entry is also falling quickly, encouraging law enforcement agencies of all sizes to use it. For instance, a mid-sized police force in Oregon has uploaded 300,000 booking photos into Amazon’s Rekognition system. The cost: $400 upfront and a $6 monthly fee. With Amazon’s market reach and attractive AFR price point, one can confidently predict that most law enforcement agencies around the country will be buyers of Rekognition or a similarly priced system from a competitor within the next five years.
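
Using only the figures above, and assuming for illustration a one-year horizon, the per-photo arithmetic shows just how low that barrier to entry is:

```python
# Back-of-the-envelope cost per booking photo; one-year horizon is an assumption.
photos = 300_000
upfront_dollars = 400
monthly_dollars = 6

first_year_cost = upfront_dollars + 12 * monthly_dollars   # $472
cents_per_photo = 100 * first_year_cost / photos           # ~0.16 cents

print(f"first-year cost: ${first_year_cost}")
print(f"per booking photo: {cents_per_photo:.2f} cents")
```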

AFR researchers and vendors claim that these systems can now accurately determine a person’s sexual orientation and ethnicity, and whether the person is lying, all from recordings made available via police body cameras.

The opportunities for governmental abuse are obvious. In China, where AFR accuracy is claimed to be 99.8 percent, the technology has been enthusiastically embraced by the government to help ensure political and social control of its 1.4 billion citizens. While Western governments may not go to the same extreme, it is not difficult to imagine that, in times of societal unrest, the temptation to use facial recognition technology for similar purposes would be hard to resist.

The ACLU is savvy enough to know that its letter to Amazon will come to naught, but the letter does serve to highlight, for a wider public audience, the privacy and civil liberty risks posed by unrestricted government use of AFR technology. Whether politicians will be sufficiently moved by those dangers to begin putting legal constraints on automated facial recognition anytime soon is another question altogether.
