In the last few weeks, a company called Clearview has been in the news for marketing a reckless and invasive facial recognition tool to law enforcement that (they claim) can identify people in billions of photos nearly instantaneously. And Exhibit A in support of their claim to law enforcement that their tool is accurate? An “Accuracy Test” that Clearview boasts was modeled on the ACLU’s work calling attention to the dangers of face surveillance technology.

Imitation may be the sincerest form of flattery, but this is flattery we can do without. If Clearview is so desperate to begin salvaging its reputation, it should stop manufacturing endorsements and start deleting the billions of photos that make up its database, switch off its servers, and get out of the surveillance business altogether.

Clearview’s failed attempt to use the ACLU’s work to support its product exposes a larger industry truth: Companies can no longer deny the civil rights harms of face surveillance in their sales pitches. This harkens back to when we obtained emails between Amazon and the Sheriff’s Office in Washington County, Oregon, expressing the (100 percent valid) concern that the “ACLU might consider this the government getting in bed with big data.”

Email dated February 15, 2017, from the Washington County Sheriff’s Office to Amazon, obtained by the ACLU.

How did all this start? In May 2018, the ACLU released the results of an investigation showing that Amazon was selling face surveillance technology to law enforcement. Then, in a July 2018 test simulating how law enforcement was using the technology in the field, Amazon’s Rekognition software incorrectly matched 28 members of Congress, identifying them as other people who had been arrested for a crime. The false matches were disproportionately of people of color, including six members of the Congressional Black Caucus, among them civil rights legend Rep. John Lewis (D-Ga.). The results of our test were not new: numerous prior academic studies had found demographic differences in the accuracy of facial recognition technology.

Following our work, members of Congress raised the alarm with Amazon and numerous government agencies over the civil rights implications of facial recognition technology. Cities nationwide are banning government use of the technology as part of ACLU-led efforts starting in San Francisco. In Michigan and New York, activists are fighting to prevent the face surveillance of Black communities, tenants, and school children. And last year, California blocked the use of biometric surveillance with police body cameras.

There is a groundswell of opposition to face surveillance technology in the hands of government. And despite all that, Clearview insists on amassing a database of billions of photos, using those photos to develop a shockingly dangerous surveillance tool, selling that tool without restriction to law enforcement inside the U.S., and even pushing it on authoritarian regimes around the world.

For the record, Clearview’s “test” couldn’t be more different from the ACLU’s work, and it leaves crucial questions unanswered. Rather than searching for lawmakers against a database of arrest photos, Clearview apparently searched its own shadily assembled database of photos. Clearview claims that images of the lawmakers were present in the company’s massive repository of face scans. But what happens when police search for a person whose photo isn’t in the database? How often will the system return a false match? Are the error rates worse for people of color?

What’s more, as we pointed out when we released the findings of our test, an algorithm’s accuracy is likely to be even worse in the real world, where photo quality, lighting, user bias, and other factors come into play. Simulated tests do not account for this.

There is also no indication that Clearview has ever submitted to rigorous testing. In fact, the rigorous analyses of this technology that have been done, including by the National Institute of Standards and Technology and the pathbreaking academic work of Joy Buolamwini and Timnit Gebru, have shown that facial-analysis technology has serious problems with the faces of women and people with darker skin.

Rigorous testing means more than attempting to replicate our test, and protecting rights means more than accuracy. Clearview’s technology gives government the unprecedented power to spy on us wherever we go — tracking our faces at protests, AA meetings, political rallies, churches, and more. Accurate or not, Clearview’s technology in law enforcement hands will end privacy as we know it.

Despite all this, Clearview somehow has the gall to create the impression that the ACLU might endorse their dangerous and untested surveillance product. Well, we have a message for Clearview:

Under. No. Circumstances.

If Clearview is so confident about its technology, it should subject its product to rigorous independent testing in real-life settings. And it should give the public the right to decide whether the government is permitted to use its product at all.

We stand with the over 85 racial justice, faith, and civil, human, and immigrants’ rights organizations, over 400 members of the academic community, Amazon shareholders, Amazon employees, the cities of San Francisco, Oakland, Berkeley, Somerville, Cambridge, and Brookline, and hundreds of thousands of people who have called for an end to powerful facial recognition technology in the hands of government.

Jacob Snow, Technology & Civil Liberties Attorney, ACLU of Northern California

February 10, 2020


The Department of Homeland Security has a scary vision for expanding face recognition surveillance into our everyday lives, threatening a dystopian future in which the technology is used throughout our public spaces to scrutinize our identity, check us against watchlists, record our movements, and more. Work on building the infrastructure for this pervasive monitoring has already started, with U.S. Customs and Border Protection currently operating a face recognition system at the gates of departing international flights.  

There is ample reason to be alarmed. Face recognition technology is riddled with bias and inaccuracies, and CBP’s program will likely result in harms ranging from missed flights to lengthy interrogations, or worse. And more broadly, face recognition technology threatens to supercharge Homeland Security’s abusive practices, which have included detaining and interrogating journalists reporting on border conditions, targeting travelers based on national origin, and terrorizing immigrant communities.

Here in the United States, DHS has already laid out — and begun implementing — a very clear plan to expand face surveillance. If we allow the agency to move forward with its plan, there are all too many reasons to think that will lead our society down a dangerous path.

Here is what that pathway looks like, in five steps:

1. Expanding CBP’s existing face recognition system to TSA checkpoints nationwide

CBP’s current program, called the Traveler Verification Service (TVS), is limited to international departure gates at a growing number of U.S. airports. Departing international passengers pose for a photograph at the aircraft gate. The photo is then compared to a pre-assembled gallery, stored in the cloud, of government mug shots (mostly passport and visa photos) of all the passengers registered for that flight. Face recognition is used to make sure the photo of the person posing matches someone in the gallery.

But that’s just the beginning. CBP has started a “demonstration program” aimed at integrating its TVS face recognition program into TSA security checkpoints for passengers who have tickets for “specified international flights.” The TSA is also looking at using CBP’s infrastructure to roll out face recognition for PreCheck travelers. Extending the TVS program beyond aircraft gates to TSA checkpoints and elsewhere would mean building an infrastructure of cameras and devices that could then be scaled up, making it much easier for face scanning to expand.

2. Putting all fliers through the face tracking system

Once CBP’s infrastructure is in place at TSA checkpoints and elsewhere, the government plans to start tracking the faces of more and more of the over two million passengers who pass through the TSA’s security checkpoints every day — and eventually all of them. A strategic roadmap that the TSA issued in 2018 directs the agency to move beyond PreCheck passengers and push the general traveling public into face recognition systems. The goal is for these systems to be integrated with other parts of DHS as well as with industry partners.

3. Making face scans mandatory

Right now, CBP says that submitting to its face surveillance system is optional for American citizens, but there is ample reason to suspect that the government will want to make the face recognition checks mandatory for all. CBP has already said it plans to make face recognition mandatory for noncitizens. A very similar process happened with the TSA’s body scanners: When they were new and controversial, the agency emphasized that they were voluntary, but after the controversy died down, the TSA quietly made them mandatory.

4. Running faces against watchlists

Once face surveillance becomes entrenched at TSA checkpoints, there will be even more pressure to turn those checkpoints into broader law enforcement checkpoints where people are subject to criminal, immigration, and intelligence watchlist checks. CBP has already said it plans to start running some passenger photos through a biometric watchlist. As such checks expand, pressure will build to try to identify everyone from parole violators to deadbeat dads. And as the number of watchlist checks increases, so will the number of random Americans who get mistaken for somebody on those watchlists.

5. Expanding beyond the airport

If face surveillance becomes pervasive in airports, we can expect to see it expand outward. Airport bag searches, first introduced in the 1960s and 1970s, have since spread throughout American life to many office buildings, schools, museums, sports stadiums, and public gatherings. Face recognition is likely to follow the same path toward the “airportization of American life.”

In China, the government has installed face surveillance checkpoints at key ports of entry to track and target ethnic minorities, and monitor people across the country. We don’t want to see anything like that happen in our country. CBP’s TVS program is the first government face recognition checkpoint in American history, and if we decide to let its deployment continue, where will that lead? We don’t have to wonder because the government has already told us much of the story. But there’s still a lot more the public needs to know, which is why we’ve asked the government to turn over documents about the program’s implementation and future. At the same time, we’re calling on Congress to press pause on the use of face surveillance for law enforcement and immigration enforcement purposes before it forever alters our free society.

An ACLU white paper on the expansion of CBP’s face recognition program is available here.

Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project

February 6, 2020

