Facial recognition technology 'reinforces discriminatory policing' in New York

University College London

New Yorkers living in areas at greater risk of stop-and-search by police are also more exposed to invasive facial recognition technology, according to a new analysis involving UCL researchers and led by Amnesty International.

In the Bronx, Brooklyn and Queens, the research also showed that the higher the proportion of non-white residents, the higher the concentration of facial recognition-compatible CCTV cameras.

The findings are based on crowdsourced data obtained by thousands of digital volunteers as part of the Decode Surveillance NYC project, which mapped more than 25,500 CCTV cameras across New York City.

Dr Julien Cornebise (Honorary Associate Professor at UCL Computer Science), who helped design and implement the crowdsourcing project, said: "It's always a privilege to work with Amnesty International, supporting their incredible expertise in social change with our own expertise in data and algorithms. A project like this is truly unique. It joins human rights experts, a literal crowd of volunteers from all over the world, and our own academic rigour of quantitative analysis and algorithms, all united to shape the impact of technology on society."

The research marks the latest phase of Amnesty International's Ban The Scan campaign, following investigations into surveillance in New York and Hyderabad in India last year. Amnesty International is calling for a total ban on the use, development, production, sales, and export of facial recognition technology for mass surveillance purposes by both states and the private sector.

According to Amnesty International, the New York Police Department (NYPD) used facial recognition technology in at least 22,000 cases between 2016 and 2019. Data on incidents of stop-and-search by the NYPD since 2002 shows that Black and Latinx communities have been the overwhelming target of such tactics, Amnesty has said.

The latest research involved Dr Cornebise, BetaNYC, a civic organisation using data and technology to hold government to account, and Dr Damon Wischik, an independent data scientist.

Matt Mahmoudi, Artificial Intelligence and Human Rights Researcher at Amnesty International, said: "Our analysis shows that the NYPD's use of facial recognition technology helps to reinforce discriminatory policing against minority communities in New York City.

"We have long known that stop-and-frisk in New York is a racist policing tactic. We now know that the communities most targeted with stop-and-frisk are also at greater risk of discriminatory policing through invasive surveillance.

"The shocking reach of facial recognition technology in the city leaves entire neighbourhoods exposed to mass surveillance. The NYPD must now disclose exactly how this invasive technology is used.

"Banning facial recognition for mass surveillance is a much-needed first step towards dismantling racist policing, and the New York City Council must now immediately move towards a comprehensive ban."
