
Police Say They Can Use Facial Recognition, Despite Bans

More than a dozen cities have passed facial recognition bans in the past couple of years, but police say there are loopholes

[Photo: A closeup of a pair of eyes being scanned with facial recognition software. Credit: Ian Waldie / Getty Images]

Mere hours after supporters of former president Donald Trump forced their way into the Capitol Building on Jan. 6, sleuths, both amateur and professional, took up the task of combing through the voluminous videos and photos on social media to identify rioters. Facial recognition technology—long reviled by police reform advocates as inaccurate and racially biased—was suddenly everywhere.

A college student in Washington, D.C., used facial recognition to extract faces from videos on social media. The Washington Post used facial recognition to count the number of individual faces at the Capitol Building attack, and a researcher from Citizen Lab used it to identify people involved in the riots. And when the FBI posted photos of rioters, looking for help with identification, the Miami Police Department assigned two detectives to scan faces into the department’s Clearview facial recognition app. 

The episode was a reminder that facial recognition software is now ubiquitous in the private and public sectors—a fact that often gets overlooked as cities pass high-profile laws that purport to ban law enforcement from using the technology. The Markup examined 17 bans passed in the past couple of years, speaking with local officials and reading through official documents. In six of those cities, officials either told The Markup or otherwise publicly stated that loopholes in the bans effectively allow police to access information garnered through facial recognition.

The bans in Pittsburgh; Boston; Alameda, Calif.; Madison, Wis.; Northampton, Mass.; and Easthampton, Mass., all have language in their regulations that may allow local police to continue using facial recognition through state and federal agencies or the private sector.

Some say such loopholes are a good thing: Following the riots in D.C., Massachusetts governor Charlie Baker said keeping facial recognition technology as a tool is necessary precisely because of situations like the Jan. 6 riot. Late last year, Baker pushed for exceptions to a statewide restriction on facial recognition before agreeing to sign the bill. 

Others, however, said laws need to go further and explicitly prevent police from evading bans. 

“The realist in me has no doubt that police departments will try to wedge in any kind of loopholes around the use of this technology or any other sort of tool that they have at their disposal. This is not a surveillance problem; this is a policing problem,” Mohammad Tajsar, a senior staff attorney for the American Civil Liberties Union of Southern California, said. “If you create a carve-out for the cops, they will take it.”


Different Cities, Different Loopholes

When Pittsburgh passed its ban at a City Council meeting in September, Council Member Ricky Burgess voted for it, though under protest. 

“There’s a part of the legislation that says it doesn’t apply to us using software produced or shared by other police departments,” he said at the meeting. “This does not stop facial recognition.”

The ordinance has a section that notes that the law “shall not affect activities related to databases, programs, and technology regulated, operated, maintained, and published by another government entity.”

Boston, Madison, Wis., and Alameda, Calif., have similar language.

The Alameda Police Department didn’t respond to The Markup’s questions on that city’s ban, but when the ban passed, an assistant city manager testified that the “software could be leveraged as a resource in the scenario of a crime spree involving the Federal Bureau of Investigations [sic],” which uses facial recognition, but “the technology is not something the City of Alameda would be paying for or directly seeking.”

Tyler Grigg, a public information officer for the Madison Police Department, told The Markup that officers can use facial recognition provided by businesses even though it’s banned from government use. 

In Easthampton, Mass., the ban still allows police to use facial recognition as evidence if it comes from another law enforcement agency, but not businesses, Dennis Scribner, a public information officer for the Easthampton Police Department, said. 

Northampton, Mass., police chief Jody Kasper told The Markup that the department could use information from facial recognition provided by both outside agencies and businesses.

Tali Robbins, policy director for Boston city councilor Michelle Wu, who authored that city’s ban, confirmed that Boston police may have access to facial recognition technology through other agencies. 

Kade Crockford, director of the Technology for Liberty program at the ACLU of Massachusetts, said it can be difficult for police to effectively track where their tips are coming from and ensure that facial recognition wasn’t used.

“We want the ordinances to actually have an impact,” Crockford said. “If they’re too narrow in the sense that they restrict law enforcement conduct like the use of information that comes from an outside agency, we worry there’s a slippery slope that they’ll just ignore it.”

Ultimately, Crockford said, a federal restriction or ban on the technology would best prevent facial recognition from being used at all. 

However, some argue police are misinterpreting the local bans already in place. Chad Marlow, a senior advocacy and policy counsel at the ACLU, told The Markup that Northampton police should not be able to access the technology, through any means, under that city’s ban. 

“They are not allowed to spend any resources, including personnel time, on facial recognition. That’s what the law says. There are no carve-outs in the law,” Marlow said about Northampton’s police chief’s interpretation.


It’s Not Always Easy to Track How Facial Recognition Gets Used

In many known cases of police using facial recognition, the suspect wasn’t aware police used the technology until much later. 

An NBC Miami investigation found that Miami police arrested protesters using facial recognition. The arrest reports noted only that police had identified suspects using “investigative means,” and even defense attorneys said they were not aware facial recognition was used until approached by NBC.  

Jacksonville police arrested a man for selling $50 of cocaine and identified him by using facial recognition but didn’t disclose the technology’s use in the police report.

“Even when facial recognition is being used in investigations, it’s typically hidden,” Jake Laperruque, a senior counsel at the Constitution Project, said.

Police, for instance, can get tips based on a private business’s use of facial recognition software. Companies like Rite Aid, Home Depot, and Walmart have implemented or tested the technology in their stores. 

Cities banned facial recognition for police use because of its known bias against people of color and women, and it’s no different when businesses are using the technology, Laperruque said.

“This stuff can be wrong a lot, and it’s especially wrong for people of color,” he said. “If this is something that’s going to lead to a store calling the police on a person, that to me creates a lot of the same risks if you worry about facial recognition misidentifying someone by the police.”

A New York teen recently filed a multimillion-dollar lawsuit against Apple in the U.S. District Court for the Southern District of New York, alleging he was misidentified as a chronic shoplifter when Apple’s security firm linked his name to surveillance footage of a different person. The actual shoplifter, according to the complaint, had stolen the teen’s driver’s permit and presented it to security when he was caught shoplifting at multiple Apple stores. 

When New York police officers arrested Ousmane Bah, the lawsuit says, they quickly realized they had the wrong person, telling Bah he was likely “incorrectly identified based on a facial recognition system utilized by Apple or [Security Industry Specialists].”

Apple, which declined to comment on the lawsuit to The Markup, has denied it uses facial recognition software in its stores. 

Information also flows freely among law enforcement agencies that may be operating under different regulations. In San Francisco, the first city to ban the technology, controversy ensued when facial recognition showed up in a criminal case last September. 

San Francisco police had sent out a bulletin requesting help identifying a gun discharge suspect in a photo. Another law enforcement agency, the Northern California Regional Intelligence Center (NCRIC), responded with an ID derived from PhotoMatch facial identification software. 

NCRIC, a partnership of federal, state, and local departments, does not fall under the jurisdiction of San Francisco’s facial recognition ban, executive director Mike Sena told The Markup, and runs facial recognition searches anytime it gets an identification request. 

“Our job is to help locate bad guys, no matter what city they’re in,” Sena said. “The worst thing I can do is hold onto a potential match.”

Public officials in San Francisco, however, raised a fuss when they became aware of the case, insisting that the city’s ban precluded the department from using NCRIC’s identification. (The SFPD claims several officers recognized the suspect on their own before receiving the facial recognition match.) 

The case is ongoing and scheduled for trial in March.

SFPD public information officer Michael Andraychak told The Markup that going forward, the department “would not be able to use any identification obtained via facial recognition software.”


The Portland Model

Last September, Portland, Ore., passed the most comprehensive facial recognition ban to date, prohibiting not only law enforcement use but also use in places of public accommodation (e.g., restaurants and other places open to the general public).

“Once we started doing our due diligence to develop our own policy, we started getting a lot of community feedback and recognizing the role that private businesses are having in connecting people’s information,” said Hector Dominguez, the Open Data Coordinator with Portland’s Smart City PDX.

But there was pushback from industry groups, revealing just how widespread the technology has become. Tech giant Amazon lobbied the city for the first time ever because of the measure, spending $12,000. The Portland Business Alliance asked for carve-outs to the law for airlines, banks, hotels, retailers, concert venues, and amusement parks, while the Oregon Bankers Association asked for exceptions to allow use of facial recognition to provide police evidence in robberies. 

The Portland ban does allow one exception: Businesses and agencies operating within the city may use facial recognition if they say they must do so to comply with federal, state, or local laws (such as Customs and Border Protection, operating at the airport). But businesses, ultimately, were included in the ban—a move ban advocates say was necessary.

“Industry often has more of an onus to surveil than police do in everyday circumstances,” Lia Holland, an organizer in Portland with Fight for the Future, said. “The impunity to save that data forever, to match those faces to customers’ faces, is something that police departments might not have the capacity to do in the same way that a company does.”
