New Yorkers living in areas at greater risk of police stop-and-frisk are also more exposed to invasive facial recognition technology, a new study by Amnesty International and its partners shows.
New analysis from the global Ban the Scan campaign demonstrates that the New York City Police Department's (NYPD) sprawling surveillance operation disproportionately affects people who have already been the targets of stop-and-frisk across New York City's five boroughs.
Banning facial recognition for mass surveillance is an important first step in the fight against racist policing.
Matt Mahmoudi, Researcher on Artificial Intelligence and Human Rights at Amnesty International
“Our analysis shows that the NYPD's use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York,” said Matt Mahmoudi, Artificial Intelligence and Human Rights Researcher at Amnesty International.
“We have long known that stop-and-frisk in New York is a racist policing tactic. We now know that the communities most targeted by this tactic are also at greater risk of discriminatory policing through invasive surveillance.
“The shocking scale of facial recognition technologies deployed throughout the city is leaving entire neighborhoods open to mass surveillance. The NYPD must reveal exactly how it uses this intrusive technology.
“Banning facial recognition for mass surveillance is an important first step in the fight against racist policing, and the New York City Council should move towards a comprehensive ban now.”
The findings are based on crowdsourced data gathered by thousands of digital volunteers for the Decode Surveillance NYC project, which mapped more than 25,500 surveillance cameras across New York City. Amnesty International worked with data scientists to compare this data with demographic and stop-and-frisk statistics.
Facial recognition technologies for identification purposes are mass surveillance systems that violate the right to privacy and threaten the rights to freedom of assembly, equality and non-discrimination.
The NYPD used these technologies on at least 22,000 occasions between 2016 and 2019. Stop-and-frisk data recorded by the NYPD since 2002 show that Black and Hispanic communities have been the main targets.
Amnesty International sued the NYPD last year over its refusal to release public records relating to its acquisition of facial recognition technology and other surveillance tools. The case is ongoing.
New interactive website detailing facial recognition capabilities
Today, Amnesty International is launching a new website that lets users see whether they would be exposed to facial recognition technology while walking a specific route between two points in New York City.
During the Black Lives Matter protests in mid-2020, New Yorkers taking part were heavily exposed to facial recognition technology. For example, a protester walking from the nearest subway station to Washington Square Park would be monitored by NYPD Argus cameras for the entire journey.
“Looking at the routes from the nearest subway stations to the protest sites and back, we found near-total coverage by public surveillance cameras, most of them NYPD Argus cameras,” said Matt Mahmoudi.
“The ubiquitous use of facial recognition technology amounts to digital stop-and-frisk. At protest sites, it is used to identify, track and pursue people who are simply exercising their human rights.
“This NYPD intimidation tactic has no place in a free society: it must stop immediately.”
The site also allows users to see the facial recognition technology deployed between major tourist attractions in the city, calculating the distance and a possible route.
The widespread use of facial recognition technology amounts to digital stop-and-frisk.
Amnesty International is urging New Yorkers to take action by sending a protest letter to their elected city officials, calling for a bill banning facial recognition technology in order to protect their neighborhoods. Users everywhere can sign Amnesty International's petition calling for regulation of when and where facial recognition systems may be used in public spaces.
Amnesty International's partners in this study include Julien Cornebise of the Department of Computer Science at University College London; BetaNYC, a civic organization that uses data and technology to hold government accountable; and Damon Vishik, an independent data scientist.
This research marks the latest phase of the Ban the Scan campaign, following surveillance investigations conducted last year in New York and in Hyderabad, India. Amnesty International is calling for a total ban on the use, development, production, sale and export of facial recognition technology for mass surveillance by both states and the private sector.