Senator Brad Hoylman Announces New Legislation To Protect Civil Liberties By Banning The Use Of Facial Recognition Technology By Law Enforcement
January 27, 2020
Hoylman: “New York must take action to regulate this increasingly pervasive and dangerously powerful technology, before it’s too late.”
New Legislation Announced Days After Major Exposé On Clearview AI, A Biometric Surveillance Company With A Massive Facial Recognition Database Reportedly Used By Some NYPD Officers
NEW YORK—Today, State Senator Brad Hoylman (D/WF-Manhattan), chair of the New York Senate Judiciary Committee, will introduce new legislation to prohibit law enforcement from using facial recognition and other biometric surveillance technology, which poses a serious threat to the privacy and civil liberties of all New Yorkers. Multiple instances of law enforcement abusing facial recognition technology have been uncovered in recent months, including strong evidence that law enforcement manipulates images, stores photos of minors, and uses the controversial app created by Clearview AI.
Senator Hoylman said: “Facial recognition technology threatens to end every New Yorker’s ability to walk down the street anonymously. In the wrong hands, this technology presents a chilling threat to our privacy and civil liberties – especially when evidence shows this technology is less accurate when used on people of color, and transgender, non-binary and non-conforming people. New York must take action to regulate this increasingly pervasive and dangerously powerful technology, before it’s too late.”
Senator Hoylman’s legislation would prohibit any police agency, police officer, peace officer, or member of the state police from acquiring, possessing, accessing, installing, activating or using any biometric surveillance system in the performance of their job duties, or with regard to any information obtained, processed, or accessed in the course of those duties. The bill does not restrict other existing lawful practices involving biometric information by law enforcement, such as the state’s well-regulated DNA index and the fingerprint records maintained by the state identification bureau. In addition, the bill would create a Task Force to study the issue and recommend standards for use of the technology if it were to be allowed in the future.
Jerome Greco, Supervising Attorney of the Digital Forensics Unit at The Legal Aid Society, said: “Facial recognition technology and other forms of biometric surveillance are inaccurate, pervasive, easily abused, and a direct threat to New Yorkers’ privacy and civil liberties. Our clients and other underserved communities have long suffered from the harmful effects of surveillance. New York State must prevent law enforcement from using advances in technology to further this disparate impact. The Legal Aid Society thanks Senator Hoylman for introducing this important legislation and we urge Albany to enact it immediately.”
Michael Sisitzky, Lead Policy Counsel at the New York Civil Liberties Union, said: “Biometric recognition systems like face surveillance are unethical and wildly inaccurate, and they have no place in the hands of law enforcement. There’s overwhelming research showing facial recognition’s inability to accurately identify women, young people, and people of color. And when these tools are used by law enforcement, the consequences can be devastating. This legislation provides critical protections for New Yorkers against harmful and discriminatory technologies that do little more than subject communities of color to wrongful targeting, interrogation, detention, and conviction.”
The use of facial recognition technology and other kinds of biometric surveillance by law enforcement is becoming increasingly common, raising serious ethical concerns. This month, news reports uncovered the enormous scope of Clearview AI’s facial recognition app, which relies on a database of more than 3 billion images scraped from individuals’ profiles on social media websites like Facebook and YouTube; it was later reported that certain NYPD officers have used Clearview’s app as well. Over the past year, multiple instances of law enforcement abusing facial recognition technology have been uncovered, including photo manipulation and evidence that the NYPD has been storing thousands of photos of children and teenagers. Concerns regarding overuse of this technology are not theoretical: China currently maintains a massive network of surveillance cameras capable of using automated facial recognition to monitor the movement of people within its borders, and it has been used with particularly devastating consequences against ethnic minorities such as the Uighurs.
Evidence shows that facial recognition technology is highly inaccurate when analyzing the faces of people of color and other vulnerable populations. One study found that commercially available facial recognition algorithms “consistently perform worse” when asked to identify women, African Americans and younger people. Transgender, non-binary and gender non-conforming individuals are also regularly misidentified, according to research from the University of Colorado Boulder.