THE INSTITUTE

Just as in George Orwell’s 1984, facial recognition is being used today to identify, analyze, and track people in real time. Although the technology has its benefits—such as its ability to help find criminals and missing people—many people have privacy concerns and worry about possible bias in the code.
In the United Kingdom, citizens are urging the government to restrict use of the software for just those reasons.
In May San Francisco became the first U.S. city to ban the use of the technology by its police force and city agencies. The ban was prompted by concerns that facial-recognition surveillance unfairly targets and profiles certain members of society, especially people of color. Somerville, Mass., and Oakland, Calif., have since followed suit. Officials in those cities cited the lack of technical standards for the technology as a reason for the bans.
The IEEE P7013 Inclusion and Application Standards for Automated Facial Analysis Technology working group is addressing those concerns. The team is investigating the scope of facial-recognition algorithms and associated metrics that could be standardized.
“IEEE is moving away from writing standards that are strictly related to a technology, process, or device into those that address societal concerns,” says IEEE Senior Member Marie-Jose Montpetit, chair of the IEEE P7013 working group. “Facial recognition happens to be a technology that has many such concerns, such as privacy, racial and gender bias, and inaccuracy.
“As a member of the public, I am also concerned that facial-recognition software might be used in inappropriate ways by law enforcement and other organizations. My interest is making sure that we develop the standards necessary for the technology to be used ethically and when necessary.
“As a citizen, I agree with the temporary bans, because I realize a lot of the algorithms in the software aren’t accurate enough to be considered nonbiased. Once the algorithms improve, the regular use of this software may become more acceptable.”
Organizations have started to use facial-recognition technology without going through an acceptable approval process and without considering the impact the technology could have on the general population, she says. “For example,” Montpetit says, “there is a restaurant in the United Kingdom using it to make sure patrons are being served in the correct order. This issue makes the role of our working group even more important.”
Almost every company now using facial-recognition technology is monitoring its own applications, and there are no regulatory agencies making sure the technology is being used ethically. An investigation of New York City police officers, for example, found they edited suspects’ photos—and uploaded celebrity lookalikes—in efforts to manipulate facial-recognition results.
“Every week there’s a news article about the unethical use of the technology, specifically in relation to privacy concerns,” Montpetit says. “One issue with facial-recognition software is that it’s inaccurate.
“The technology has been continuously accused of being biased against certain groups of people. I think it’s important for us to define mechanisms to make sure that if the technology is going to be used, it’s used in a fair and accurate way.”
She says the working group will define acceptable metrics.
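To make the idea of “acceptable metrics” concrete, here is a minimal sketch of one kind of measurement a standard might specify: the gap in recognition accuracy between demographic groups. This is an illustrative example only; the function names, the groups, and the 5 percent threshold are invented for the sketch and are not part of IEEE P7013.

```python
# Hypothetical illustration of a per-group accuracy-disparity metric.
# A standard could require that the worst gap between any two groups
# stay under some agreed threshold before a system is deployed.

def per_group_accuracy(results):
    """results: list of (group, correct) pairs from evaluation trials."""
    totals, correct = {}, {}
    for group, ok in results:
        totals[group] = totals.get(group, 0) + 1
        correct[group] = correct.get(group, 0) + (1 if ok else 0)
    return {g: correct[g] / totals[g] for g in totals}

def accuracy_disparity(results):
    """Largest difference in accuracy between any two groups."""
    acc = per_group_accuracy(results)
    return max(acc.values()) - min(acc.values())

# Invented data: a system that is 95% accurate on group A
# but only 80% accurate on group B.
trials = ([("A", True)] * 19 + [("A", False)]
          + [("B", True)] * 16 + [("B", False)] * 4)

disparity = accuracy_disparity(trials)
print(f"accuracy gap: {disparity:.2f}")  # ~0.15; would fail a 5% threshold
```

A real standard would of course go far beyond this—specifying the evaluation data, the error types measured (false matches versus false non-matches), and how demographic groups are defined—but even a simple gap metric shows how a qualitative concern about bias can be turned into a testable compliance measure.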
“IEEE recognizes that it’s important to make sure this technology and its algorithms are standardized, have well-defined use cases and metrics, and are well-understood,” she says. “There’s also a lot of diversity within the organization, whether that’s in the area of expertise, gender, or type of organization. IEEE can take every point of view into consideration while developing the standard—which is important.”
The working group is reviewing a variety of situations in which the software was used appropriately and inappropriately to give its members a well-rounded view.
“By reviewing the use cases, we can define a set of compliance measures,” Montpetit says. “We need to define a common set of features for the technology, such as facial analysis, capturing and analyzing movement, and activity detection. Another goal is to create a guide for how to ethically use the software.”
It’s the responsibility of IEEE and other engineering organizations to standardize the technology, Montpetit says: “I think it’s up to us—engineers, academics, and social scientists—to define the boundaries of what the technology can do and how it does the job. Then, when it becomes an issue of using the software for the public, it is the role of the government to oversee its uses.”
Those interested in contributing can join the IEEE P7013 working group.
Joanna Goodrich is the associate editor of The Institute, covering the work and accomplishments of IEEE members and IEEE and technology-related events. She has a master's degree in health communications from Rutgers University, in New Brunswick, N.J.