Moscow is using this relatively new technology to make sure people with COVID-19 remain quarantined in their homes. The Transportation Security Administration, the FBI and local law enforcement officials use it to help identify terrorists and criminals. Retailers use it to study traffic patterns and customer demographics. And some of you use it to unlock your phones.
Facial recognition technology has some state lawmakers, government officials, businesspeople and consumers excited about its possible uses and benefits. Yet others warn of its potential abuse of privacy and are seeking ways to limit the use of the technology.
Growing Fast as Accuracy Improves
Facial recognition systems use algorithms, step-by-step sets of instructions for solving a problem or performing a computation, to identify the faces of individuals in photographs. The accuracy of the technology has advanced quickly in recent years. Between 2014 and 2018, facial recognition software became 20 times better at searching a database to find a matching photograph, according to a study of 127 vendors’ algorithms by the National Institute of Standards and Technology, released late last year. Some companies claim their software is more than 99% accurate.
Recent studies, including NIST’s, however, have also uncovered limitations. The NIST study found that the algorithms’ ability to match two images of the same person varied among demographic groups, with dramatically higher rates of false positives—incorrectly declaring images of two different people a match—for Black, Asian and Native American faces than for white faces. The study noted, however, that “not all algorithms give this high rate of false positives across demographics,” adding “those that are the most equitable also rank among the most accurate.”
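The false positive disparity NIST measured can be illustrated with a short sketch. It computes the false match rate—the share of image pairs of different people that an algorithm wrongly declares a match—separately for two groups. The group names, numbers and the helper function here are hypothetical, invented for illustration; they are not NIST data or NIST methodology.

```python
# Illustrative sketch only: measures false match rates per demographic
# group from hypothetical one-to-one verification results. All data
# below is made up for illustration, not drawn from the NIST study.

def false_match_rate(results):
    """results: list of (is_same_person, algorithm_said_match) pairs."""
    # "Impostor" pairs are images of two different people.
    impostor_pairs = [r for r in results if not r[0]]
    # A false match occurs when an impostor pair is declared a match.
    false_matches = [r for r in impostor_pairs if r[1]]
    return len(false_matches) / len(impostor_pairs)

# Hypothetical verification outcomes for two demographic groups:
# each group has 1,000 impostor pairs, but the algorithm wrongly
# matches far more of them in Group B than in Group A.
group_a = [(False, False)] * 990 + [(False, True)] * 10
group_b = [(False, False)] * 900 + [(False, True)] * 100

for name, results in [("Group A", group_a), ("Group B", group_b)]:
    print(f"{name}: false match rate = {false_match_rate(results):.1%}")
```

A gap like the one sketched here—a tenfold difference in false match rates between groups—is the kind of disparity that makes equal accuracy across demographics a central concern for regulators.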
Some Tech Companies Pull Out, Others Expand
Earlier this summer, concerns about inaccuracies and the potential misuse of biometric data caused several major technology companies to announce they would stop the sale of facial recognition technology to governments. IBM said it would no longer offer general-purpose facial recognition or analysis software out of concern it could be used for mass surveillance or racial profiling. Amazon pledged a one-year moratorium on selling the technology to police departments. And Microsoft stated it would wait until Congress passes a federal law regulating facial recognition before selling it to police departments.
But hundreds of other companies are continuing in the facial recognition business. Apple and Google use facial authentication to give users of their products secure and sole access to their devices and to the photos and other information stored on them. These companies have refrained from selling their facial recognition products to law enforcement.
The current or potential practices of other companies are raising serious concerns, however. Clearview AI, founded just three years ago and based in New York City, has amassed a database of 3 billion facial images by scraping the internet, including social media sites. That volume of images has enabled it to be more accurate than the human eye, company co-founder Hoan Ton-That has said. The company has sold or shared its software with more than 600 police departments in the U.S. and Canada, it says, as well as with some widely known businesses.
It’s doubtful that social media users expected the photos they posted would be used in this way, or that images they deleted or that were posted by others would be widely available for anyone to use.
Several companies, including Google, YouTube, LinkedIn and Twitter, sent cease-and-desist letters based on Clearview’s violation of terms of service provisions that prohibit scraping, and demanded that previously collected images be deleted. Clearview claims it collects only publicly available images and that it has a First Amendment right to do so.
Clearview also faces class-action lawsuits for violating Illinois’ Biometric Information Privacy Act and California’s Consumer Privacy Act. Vermont Attorney General T.J. Donovan (D) also has filed suit, alleging violations of that state’s consumer protection act and recently enacted data broker privacy law.
Facial Recognition and Law Enforcement
Most Americans trust law enforcement to use facial recognition responsibly—more than they trust advertisers or tech companies to do so, according to a 2019 Pew Research Center survey. When used properly, by officials who recognize that the results of a face recognition search are only possible matches and not positive identification, the tool can be very valuable.
Facial recognition helped find a convicted killer, who had escaped prison in 1977, when federal marshals used the technology to compare an early photo with images in a driver’s license database. Law enforcement officials are also using the technology to combat the trafficking of minors and other crimes.
But there are also real-life examples of the weaknesses of the technology—most recently, the case of Robert Williams, who is Black and was arrested and held for nearly 30 hours after facial recognition software used by the Detroit Police Department misidentified him as a shoplifter.
Commercial Uses of Facial Recognition
Facial recognition offers many consumer conveniences and benefits, such as paying for services without cash and boarding an airplane without having to show a boarding pass. The technology can lock and unlock mobile phones and can be used to authorize purchases and payments.
At the National Soccer Hall of Fame in suburban Dallas, exhibits using facial recognition let fans who have shared certain personal information build their own national team, create a scarf and design an MLS kit.
Smart home devices, like Amazon’s Ring and Google’s Nest, also are growing in popularity. They use facial recognition technology to distinguish familiar friends and family members from strangers and can record instances of package theft (aka “porch piracy”) from front doorsteps to share with police.
Commercial uses of facial recognition that Americans are less enamored with, according to Pew’s survey, include tracking people entering and leaving apartment buildings, employees arriving at and leaving work and consumers responding to advertising displays in real time.
Industry Self-Regulation and Standards
Private industry and government are developing policies, principles and best practices for the appropriate use of facial recognition. For example, the International Biometrics and Identification Association released “Privacy Best Practice Recommendations for Commercial Biometric Use,” and the IEEE Standards Association is working on specifications to help ensure the technology is used ethically. The Digital Signage Federation issued privacy standards for its members that address facial recognition, and the U.S. Chamber of Commerce has published policy principles to guide policymakers as they consider legislative proposals. In addition, the Federal Trade Commission has issued best practices.
These organizations generally address issues such as transparency, consent, notice, choice, retention of data and security practices.
State Privacy Protection for Consumers
No federal laws address commercial uses of facial recognition, but three states have privacy protections in place for consumers. The Illinois Biometric Information Privacy Act, passed in 2008, prohibits commercial entities from capturing an individual’s biometric identifiers (retina or iris scans, fingerprints, voiceprints, and scans of hand or face geometry) or selling or disclosing a biometric identifier without written consent. There are some exceptions, such as for law enforcement purposes. The law also places security and retention requirements on collected biometric data. Texas and Washington have similar laws, but only Illinois’ includes a private right of action, which has led to several lawsuits, including one against Clearview AI.
The California Consumer Privacy Act defines biometric data, including facial recognition data, as personal information that is protected by the law. It gives California consumers the right to access their personal data and to request that it be deleted and not sold to third parties. It also imposes obligations on businesses, including a responsibility to protect the data.
Also, of the 50 states with security breach laws, at least 22 have added biometric data to the categories of personal information that must be reported if breached. Other state laws require businesses or government to have security measures in place to protect any personal information, including biometric data, they hold.
Checks on Government and Law Enforcement
This year, Washington state enacted one of the most comprehensive laws governing the use of facial recognition by government. It prohibits state agencies and law enforcement from collecting or using a biometric identifier without first providing notice and obtaining an individual’s consent. State agencies also must establish security policies and meet certain requirements regarding the use, storage, sale and sharing of biometric information.
The law requires state agencies to report regularly on their use of facial recognition technology and to test the software for fairness and accuracy. Except in emergencies, law enforcement agencies must obtain warrants before using the technology in investigations.
The bill faced opposition from some law enforcement groups and organizations such as the Consumer Federation of America and the Electronic Frontier Foundation. Representative Matt Boehnke (R) led opposition to the final bill. “I don’t believe we’ve worked out a lot of the issues,” he says, citing problems with security, transparency and accountability. He had favored a one-year moratorium on the use of facial recognition, due to concerns about bias in the technology. Boehnke, who has a technical background, currently serves as the director and lead professor of the cybersecurity division at Columbia Basin College and has served in the military. “I’m pro-technology, but I’m looking for the best way to balance privacy with the innovations that these new technologies have to offer us.”
Although Representative Debra Entenman (D) says the bill “was a good compromise which leaves open the opportunity to improve as we move forward,” she has concerns. Earlier in the session, she sponsored a bill to place a three-year moratorium on the use of facial recognition technologies.
"We want to make sure for all citizens that facial recognition, if used, is as accurate as possible,” she says. “If the police in this country are going to use a tool, it must be an effective tool, and currently the research still says that the tool is not effective, and it disproportionately misidentifies African Americans.”
“To me,” she adds, “this is not a partisan issue; this is a civil liberties issue.”
Laws in California, New Hampshire and Oregon address the use of facial recognition systems in police body cameras. New Hampshire and Oregon prohibit installing or using the software in body cams, and California placed a three-year ban on the practice last year.
With the implementation of comprehensive privacy laws in Europe and California, and several high-profile breaches, many predicted that 2020 would be a bellwether year for more state privacy legislation. Indeed, there was a significant increase in the introduction of bills addressing the collection and use of biometric and facial recognition data.
At least 19 state legislatures considered proposals to limit the use of biometrics by government or law enforcement, with three states debating bills to place a moratorium on facial recognition or prohibit its use in police body cams.
The same number of legislatures also considered bills on commercial uses of biometric data, such as prohibiting the capture of such information for a commercial purpose without consent or banning its use by landlords. And about half the states proposed including biometric data in their definitions of personal information as part of comprehensive privacy legislation like California’s.
Very few bills have been enacted, however, as the coronavirus pandemic disrupted legislative sessions and dramatically changed priorities for the year.
Many Americans appreciate the convenience and security facial recognition technology offers but are wary about companies and the government collecting personal data. Although Americans said they trusted law enforcement to use facial recognition responsibly when surveyed in 2019, recent events may have altered those perceptions for some. Not surprisingly, lawmakers are keeping those changing attitudes in mind as well.
Pam Greenberg tracks privacy, cybersecurity and technology issues for NCSL.