“The risk of something horrible happening… is 1 million times higher than [the chance] that the cameras are going to prevent a real situation.”

Dave Gershgorn
OneZero
1 min read · Dec 1, 2020

The small town of Lockport, New York, and its 20,000 residents have become a petri dish for testing facial recognition in schools. In the first weeks of 2020, the district activated cameras in eight schools. The cameras monitor for known sexual predators and expelled students, and analyze images for signs of weapons.

Now, a new report from Vice details how the software has been primed to fail. Black women are 10 times more likely than white men to be misidentified by the system. The system also confuses broom handles for guns. And Vice reports that the facial recognition system automatically alerts police when it detects a potential weapon or a specific person.

“The risk of an accident, the risk of something horrible happening because the system is structured the way it is, to me, is 1 million times higher than [the chance] that the cameras are going to prevent a real situation,” a parent in Lockport told Vice.

While the New York State Legislature has passed a moratorium on facial recognition in schools, Governor Andrew Cuomo has yet to sign the bill, allowing Lockport’s system to continue operating.

Read more about Lockport’s facial recognition system here:
