Education Week: Facial Recognition Tech in Schools Prompts Lawsuit, Renewed Racial Bias Concerns
On June 24, 2020, Mark Lieberman of Education Week wrote about the controversies surrounding facial recognition software implemented in a New York state school district and other schools. The article discusses legislation introduced by Senator Brian Kavanagh and Assemblymember Monica Wallace and their efforts to halt the use of this technology in New York's schools. The full text of this story is below; the original version is available via the link above.
_______________
Facial Recognition Tech in Schools Prompts Lawsuit, Renewed Racial Bias Concerns
By Mark Lieberman
June 24, 2020
A New York state school district's controversial facial recognition system hit several new snags this week, as the New York Civil Liberties Union filed a lawsuit against the state education department's decision to approve the system, and two state lawmakers called on the district to deactivate the system.
The latest developments in a long-unfolding saga come on the heels of intense scrutiny of police departments' use of facial recognition software and growing awareness of the systems' tendency to perpetuate racial bias.
News of the lawsuit was first reported by the Lockport Union-Sun & Journal, whose extensive coverage of the school district's security efforts is among the materials cited in the suit.
On Jan. 2, the Lockport school district near the Canadian border activated a high-tech security system that uses a facial recognition database of flagged individuals to detect intruders. The district has said it began considering using the software after a pro bono visit from a security consultant in 2012 following the elementary school shooting in Newtown, Conn. The consultant had connections to the Canada-based company Aegis, so the district began researching the product and eventually spent $1.4 million of a $4.2 million grant from the state to purchase it.
The state education department last May halted the planned implementation of the system, asking the district for more assurances that it would protect students' privacy. The district and the state reached an agreement at the beginning of this year, and they now contend that because the database does not include students, their data aren't at risk.
Monday's lawsuit against the state department (not the district) argues the opposite: that the system must use students' data in order to verify that they're not included in the database. Hackers could infiltrate the system and use those data for their own purposes, the lawsuit argues.
District superintendent Michelle Bradley did not respond to an interview request and emailed questions from Education Week in time for publication.
In an op-ed for the Lockport newspaper, plaintiff Jim Shultz, whose daughter attends a Lockport district school, explained the reasoning behind the suit: "The case filed this week by NYCLU is not a demand for money and it is not against the Lockport school district," he wrote. "What the case does is call on New York state education officials to apply the same parental rights and student privacy protections to these high-tech recordings that we use to safeguard other student information—their grades, their teacher comments, or videos of them giving presentations."
More Calls to Deactivate
New York state Senator Brian Kavanagh and Assemblymember Monica Wallace on Tuesday issued a statement calling on the district to deactivate the system and on the state education department to ban the use of facial recognition software in schools, the Union-Sun & Journal reported Tuesday.
Wallace and Kavanagh have previously proposed legislation that would establish a statewide ban on facial recognition in schools.
Critics of facial recognition, including Kavanagh and Wallace, point to an extensive 2019 study of more than 200 systems showing that Black and Asian faces are between 10 and 100 times more likely to be misidentified by facial recognition software than white faces. The software has been controversial for years, even as some school districts explored it as a potential tool for reducing the prevalence of school shootings.
Frustrations with the technology have intensified recently as nationwide protests against racial injustice and police brutality have put police departments' use of the technology under new scrutiny. IBM, one of the world's largest technology companies, announced this month that it will no longer produce, develop, or even research facial recognition systems. Other companies, such as Microsoft and Amazon, have temporarily restricted police departments from using their facial recognition tools.
"Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe," Arvind Krishna, IBM's CEO, wrote in a June 8 letter to Congress calling for more stringent technology regulation. "But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported."
Expert Views
Sarah St. Vincent, a human rights attorney and surveillance and digital rights expert who serves as director of Cornell Tech's Computer Security Clinic, shared with Education Week in December a list of ten questions school districts should ask before moving forward with facial recognition tools.
Among them: How many false positives did the company's testing of the product turn up? Has the manufacturer commissioned a third party to test the system? Have community members been given opportunities to weigh in? Are there options that are less intrusive?
"People of color fought and suffered for the right to be able to get into schools and other buildings the same way that white people do," St. Vincent said in December. "If facial recognition is a barrier to that, that's a problem."
Federal privacy laws that restrict the collection of student data make exceptions for security purposes, "but it's not a free-for-all," said Linnette Attai, president of PlayWell, a privacy compliance consulting firm with education clients. Other questions worth considering, Attai said, include whether parents have been fully briefed on how their students are being monitored, and whether the facial recognition database is stored on school or company servers.
"There is the law, there's school policy, and there's community norms," Attai said. "All of that needs to come together in order to create smart, sensible policies and practices in a district."