Decode Surveillance NYC is complete!
Thank you to the more than 7,000 digital volunteers who identified thousands of surveillance cameras across New York City, and helped reveal where people are more likely to be tracked by facial recognition technology.
Each intersection was analysed at least three times.
Volunteers analysed every intersection in New York City, excluding expressways.
Most volunteers came from the United States, Nigeria, Pakistan, the United Kingdom, and Bangladesh.
If every person had stood at an intersection, they would have covered an area larger than the Bronx.
That’s the equivalent of an Amnesty researcher working full-time for nearly two years.
Posts created in the Amnesty Decoders discussion forum.
In the next phase of the project, Amnesty will work on an in-depth analysis of the crowdsourced data. Our data science team will use statistical quality metrics to further test its accuracy and reliability. We will also be working with an architect, a sociologist, and local partner organisations. Our goal is to produce findings that will help us to better understand the possible impacts of facial recognition technology in New York City, especially along lines of race and socioeconomic inequity. We plan to publish the results at the end of the year alongside the raw data and analysis code.
The campaign to ban facial recognition in New York, Ban the Scan, will be using the findings from Decode Surveillance NYC to pressure the newly elected City Council to introduce a bill calling for a ban.
15 September 2020
Amnesty International USA sends Freedom of Information Law (FOIL) request to NYPD
26 January 2021
Ban the Scan campaign launches
29 January 2021
NYPD declines FOIL request
1 March 2021
Amnesty appeals FOIL decision
15 March 2021
Amnesty’s appeal is denied by the NYPD
14 April 2021
Decode Surveillance NYC launches and crowdsourcing begins
3 June 2021
Amnesty International publishes press release with early findings
25 June 2021
Crowdsourcing is completed
July – September 2021
In-depth analysis of the crowdsourced data and further research
October – November 2021
Decode Surveillance NYC findings published
Despite the call for racial justice, Black Lives Matter protests have been the site of extensive facial recognition technology use. AFP via Getty Images
About the project
Facial recognition technology can track who we are, where we go, and who we know. The technology is being used by police and private companies all over the world, in ways that often erode our human rights.
In New York City, the use of facial recognition technology (FRT) disproportionately impacts Black and other minority communities and threatens the right to protest. If we can stop authorities there from using this privacy-invasive and discriminatory technology, we will send a powerful message to governments across the world to do the same.
The New York City Police Department (NYPD) has used facial recognition technology in 22,000 cases since 2017, half of which were in 2019 alone. Yet we do not know where, when, or why. Despite mounting evidence that facial recognition technology violates human rights, key data on its use has not been made public, even after numerous freedom of information requests filed by Amnesty International.
Amnesty International and its partners are leading Ban the Scan, a campaign to outlaw the use of FRT by all government agencies in New York City.
FRT threatens our right to freedom of expression. While the NYPD claims the technology is deployed to solve only the most serious crimes, news reports tell a different story of its use to prosecute minor offences, including in cases of graffiti or shoplifting. In the summer of 2020, it was likely used to identify and track a participant at a Black Lives Matter protest, Derrick ‘Dwreck’ Ingram, who allegedly shouted into a police officer’s ear.
Government agencies' use of facial recognition disproportionately impacts Black and Brown people, who are at greater risk of being misidentified and therefore falsely arrested. Even when accurate, facial recognition is harmful: law enforcement agencies with problematic records on racial discrimination can use facial recognition software to double down on existing practices that target Black and other minority communities.
To help bring about a ban, we need more data.
In September 2020, with help from New Yorkers and in coalition with many local organizations, Amnesty asked the NYPD for more information on its use of FRT. They have so far failed to respond.
Take part using your phone or computer
Help generate the missing data
Decode Surveillance NYC gives people the opportunity to help generate the missing data. Our goal is to locate closed circuit TV (CCTV) cameras across New York City. Knowing where cameras are helps Amnesty and its partners understand where FRT can be used.
Imagery from any camera, regardless of its age, can be fed into facial recognition software. This software then compares the imagery with large databases containing millions of images, many scraped from social media without users' consent.
The NYPD has invested in facial recognition tools, including software seemingly provided by Clearview AI and DataWorks1. It uses a surveillance system developed by Microsoft, known as the Domain Awareness System, which gives police officers access to an estimated 9,000 live feeds from public and private cameras — feeds which, when combined with other cameras and facial recognition software, can be used to track the face of anyone in New York City.
Around the world many cities are only one software upgrade away from using FRT. Help us generate the missing data so that we can #banthescan.
What will I do?
As a volunteer, you will be shown a Google Street View image of a New York City intersection. We will ask you to study the image. If you see any CCTV cameras we will ask you to tag them and tell us what they are attached to—for example, traffic signals or streetlights.
Once you have tagged a camera, we may ask you to identify its type. This is straightforward and you will be given help. It is not a problem if you cannot identify the camera.
Many images will not contain cameras. This is not a bad thing. We want to know where there are no cameras, too.
Mapping cameras has been done before, notably by the New York Civil Liberties Union in 2006. This survey was completed by volunteers who walked the streets of New York City counting cameras. We are doing something similar, just online. Taking the activity online means that we can widen access to involve thousands of volunteers from around the world.
Anyone with a mobile phone or computer as well as an internet connection can participate. No contribution is too small. We have prepared a video tutorial to show you what to do, and there is also a help section with more examples along with an active forum where you can ask our moderators and researchers for help or discuss with other volunteers.
Why this campaign now?
This project is a call to action coinciding with the first anniversary of the killing of George Floyd, which led to the second wave of Black Lives Matter protests. Despite their call for racial justice, these protests have been the site of extensive use of facial recognition, a technology widely recognised as amplifying racially discriminatory policing.
It will also coincide with the start of the New York City mayoral primaries, helping Amnesty and our coalition partners make a definitive statement on the extent of the risks of facial recognition, backed by a data set of coordinates where New Yorkers' faces might be picked up by the technology. This will be impossible to ignore, especially as protesters once again take to the streets calling for racial justice. A ban on facial recognition is racial justice.
In New York City the devastating effects of FRT are well known. However, we have little understanding of the true extent of its usage, as well as the camera infrastructures that connect to it. Amnesty and its coalition partners have applied pressure on the NYPD to provide more information but have been blocked every step of the way. With this Decoders project, we are taking matters into our own hands. By inviting communities to help us tag the physical infrastructure that feeds facial recognition software, we will demonstrate how New Yorkers’ freedoms of assembly and expression are constantly under surveillance and threat, and pressure policymakers at the city and state levels to ban FRT. The data output from the project will serve as an educational resource as well as a powerful tool to influence legislation.
By enabling residents to see the infrastructure of FRT in their own communities, and hear testimonies about its role in police violence, we can change the narrative. FRT should be seen as a shameful and abusive endeavour and a full ban is the only way to uphold human rights.
The impact of our work so far
Ban the Scan launched on 26 January 2021 and has already hit several milestones. The campaign has brought together a coalition of a dozen local organizations in New York City, presenting a uniquely united front in our call for a citywide ban on facial recognition. Through the campaign so far, we have been able to drive over 7,000 individuals to submit comments to the NYPD, rejecting their facial recognition policy and calling for a ban.
We have been in touch with a number of sympathetic state and city legislators, who have indicated that they are considering supporting a ban. We recently launched a letter and call action targeted at New York City Council Speaker Corey Johnson, calling on him to support the introduction and passing of a bill to ban facial recognition.
The campaign has been covered by national and international media outlets, including CBS, NY Daily News, Fast Company, The Verge, Business Times, Wired, TechWorldNews, The Guardian and The Economist.
1Data pulled by Amnesty from GovSpend
Decode Surveillance NYC is an Amnesty Decoders project. Amnesty Decoders is a global network of over 50,000 digital volunteers from more than 150 countries working to support human rights research.
Decode Surveillance NYC is part of a worldwide campaign to outlaw the use of facial recognition technology. The New York City chapter of the Ban the Scan campaign is brought to you in partnership with New York City-based privacy, civil liberties and human rights organizations, who have called for a ban on government use of facial recognition technologies:
AI for the People, CryptoHarlem, Electronic Frontier Foundation, Immigrant Defense Project, New York Civil Liberties Union, Office of the New York City Public Advocate, RADA STUDIO, Reclaim Your Face, Warriors in the Garden, Surveillance Technology Oversight Project (S.T.O.P.), WITNESS.
Amnesty International would like to thank the more than 7,000 volunteers who contributed to Decode Surveillance NYC.
The Amnesty Decoders team would like to extend a special thank you to the volunteer moderators: @Emc2, @Fritz, @Francesca_F, @hanny123, @Ionized, @jhsia, @j_juliano, @MRF, @NajamK, @ajeancleux, and @Wallace. The volunteer team was led by @Sofz. Thank you all for creating and maintaining such a welcoming and supportive discussion forum.
A huge thank you, too, to everyone who made an exceptional contribution to the project, including @Janya, @shane28, @BraydenSmith, @Pmcvicker, @Fritz, @Jans, @SelinaJW, @Emc2, @Matthieub, @Sander, @NancyC21, @Francesca_F, @hmod6780, and @sdemonte, who each analysed over one thousand intersections, as well as @ecg1, @PJSmith, @IceTea, @mariaeriksson, @Kchien, @Etienne91, @MRF, @lemonTree68, @audreyjennings25, @Renee_More, @DoctorScience, @Jadolecek, @hanny123, @BK0307730, @dana271, and @jlorraine439, who each analysed over 500 intersections. Thank you to them and to the many others who analysed hundreds of images and took part in conversations in the forum.
Decode Surveillance NYC was designed and developed in collaboration with Xpon Digital. The site is powered by Hive, an open-source crowdsourcing platform developed by The New York Times, and by Discourse, an open-source discussion platform.
The project would not have been possible without the generosity and expertise of our external collaborators: Julien Cornebise (data science lead), Swetha Pilai (data science), Dr Victoria Tse (scoping), Sofia Caferri (project assistant), Eliana Rogers (illustration), Martyna Marciniak (3D modelling), Rebecca Echevarria (tutorial video presenter), Andy West (promotional video production), and Rebecca O'Neil (copywriter).
Thank you to everyone who has offered advice, ideas, and inspiration so far, including the New York Civil Liberties Union, BetaNYC, and Marena Brinkhurst from the Mapbox Community Team.
A few minutes of your time can help, on your computer or mobile. Thank you for taking part and supporting this cause.