General Intelligence

A Facial Recognition Giant Refuses to Share Details About Its Algorithm Dataset

NEC claims its systems aren’t biased — but rejects calls for transparency

Dave Gershgorn
Published in OneZero
Jul 2, 2020


Photo illustration. Photo: David Ramos/Stringer/Getty Images

The South Wales Police have been using live facial recognition since 2017 and claim to be the first department in the United Kingdom to make an arrest with the technology. Officers have surveillance cameras mounted on top of a large white van; the cameras record pedestrians and try to determine their identities.

While many facial recognition programs are hidden from the public eye or revealed only through public records requests, the U.K. has been touting its programs, publicizing the use of NEC’s technology by the South Wales Police and London’s Metropolitan Police.

Now, a lawsuit from U.K. human rights group Liberty is challenging the use of the software, claiming that it’s ineffective and racially biased.

How the U.K. determines whether facial recognition can be used within its borders could set a precedent for other European countries, just as the first cities in the United States to ban the technology have done.

Privacy advocates in the United States have been pushing a more radical solution: You don’t have to worry about biased data…


Dave Gershgorn is a senior writer at OneZero covering surveillance, facial recognition, DIY tech, and artificial intelligence. Previously: Quartz, Popular Science, and The New York Times.