- Residents of non-white neighborhoods in the Bronx, Brooklyn, and Queens live with more CCTV surveillance
- New interactive website details exposure to invasive technology
New Yorkers living in areas at greater risk of stop-and-frisk by police are also more exposed to invasive facial recognition technology (FRT), new research by Amnesty International and partners has revealed.
New analysis conducted as part of the global Ban The Scan campaign shows how the New York Police Department’s vast surveillance operation particularly affects people already targeted for stop-and-frisk across all five boroughs of New York City. In the Bronx, Brooklyn, and Queens, the research also showed that the higher the proportion of non-white residents, the higher the concentration of facial recognition-compatible CCTV cameras.
“Our analysis shows that the NYPD’s use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City,” said Matt Mahmoudi, Artificial Intelligence and Human Rights Researcher at Amnesty International. “We have long known that stop-and-frisk in New York is a racist policing tactic. We now know that the communities most targeted with stop-and-frisk are also at greater risk of discriminatory policing through invasive surveillance.”
The NYPD used FRT in at least 22,000 cases between 2016 and 2019. Data on incidents of stop-and-frisk by the NYPD since 2002 shows Black and Latinx communities have been the overwhelming target of such tactics.
“The racial bias and harm of facial recognition technology as a policing tool is well-documented. As the work of the #BanTheScan campaign in New York has demonstrated, the technology can be used as the latest iteration of bias-based policing,” said New York City Public Advocate, Jumaane Williams. “Facial recognition is a tool that can violate the rights of New Yorkers. The algorithms used have proven biased and flawed, and implementation will conform to the systemic biases we have long seen. Now is the time to ban this practice, not advance it – we cannot repeat the mistakes of the past with new technology.”
During the Black Lives Matter movement of mid-2020, New Yorkers attending protests experienced higher levels of exposure to FRT. For example, a protester walking from the nearest subway station to Washington Square Park would be under surveillance by NYPD Argus cameras for the entirety of their route.
“The pervasive use of facial recognition technology is effectively a digital stop-and-frisk. Mass surveillance technology at protest sites is being used to identify, track and harass people who are simply exercising their human rights,” added Mahmoudi. “This is a deliberate scare tactic by the NYPD that has no place in a free society, and must be stopped immediately.”
Amnesty International is today also launching a new website that allows users to discover how much of any potential walking route between two locations in New York City might be exposed to FRT surveillance. The website also lets users plot routes between the city’s major tourist attractions and see how much of each route is exposed to FRT.
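The website's exact methodology is not described here, but the kind of route-exposure measure it reports can be sketched in a few lines: sample points along a walking route and count how many fall within a fixed radius of a known camera. Everything below is illustrative, assuming made-up coordinates and a hypothetical 50-meter exposure radius, not Amnesty's actual data or algorithm.

```python
import math

# Illustrative sketch only: coordinates and the 50 m radius are invented,
# not taken from Amnesty's dataset or the website's methodology.

def meters_between(p, q):
    """Approximate distance in meters between two (lat, lon) points,
    using an equirectangular projection (adequate at city scale)."""
    lat1, lon1 = p
    lat2, lon2 = q
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = math.radians(lon2 - lon1) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(lat2 - lat1) * 6_371_000
    return math.hypot(dx, dy)

def route_exposure(route_points, cameras, radius_m=50.0):
    """Fraction of sampled route points within radius_m of any camera."""
    if not route_points:
        return 0.0
    exposed = sum(
        1 for pt in route_points
        if any(meters_between(pt, cam) <= radius_m for cam in cameras)
    )
    return exposed / len(route_points)

# Hypothetical data: three sampled points along a street, one camera
# located at the middle point.
route = [(40.7308, -73.9973), (40.7312, -73.9967), (40.7316, -73.9961)]
cameras = [(40.7312, -73.9967)]
print(route_exposure(route, cameras))
```

A production version would sample the route densely along street segments and use a proper geodesic library, but the core idea, a proximity test between route points and camera locations, is the same.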
Amnesty International encourages New Yorkers to take action by sending a letter of protest to their council member demanding the introduction of a bill that prohibits FRT to help protect their communities.
The findings are based on crowdsourced data from thousands of digital volunteers who, as part of the Decode Surveillance NYC project, mapped more than 25,500 CCTV cameras across New York City. Amnesty International worked with data scientists to compare this data with statistics on stop-and-frisk and demographic data.
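The comparison described above, relating camera concentration to stop-and-frisk statistics by area, can be illustrated with a simple correlation computation. The figures below are invented for the sketch and are not Amnesty's data; the actual analysis would involve far more tracts and careful statistical controls.

```python
# Illustrative sketch of an area-level comparison. Each row is a
# hypothetical census tract: (cameras per km², stop-and-frisk incidents).
# Numbers are made up for demonstration, not Amnesty's findings.
tracts = [
    (12, 340), (8, 210), (25, 780), (5, 90), (18, 560), (30, 910),
]

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(tracts)
print(f"camera density vs. stop-and-frisk correlation: r = {r:.2f}")
```

A correlation alone does not establish a causal link; the published research additionally layered in demographic data to examine how camera concentration tracks the proportion of non-white residents.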
Research partners Amnesty International worked with include: Julien Cornebise of the Department of Computer Science, University College London; BetaNYC, a civic organization using data and technology to hold government to account; and Dr Damon Wischik, an independent data scientist.
Last year, Amnesty International sued the NYPD after it refused to disclose public records regarding its acquisition of FRT and other surveillance tools. The case is ongoing.
Amnesty International is calling for a total ban on the use, development, production, sales, and export of FRT for mass surveillance purposes by both states and the private sector.
Contact: Gabby Arias, [email protected]