General Intelligence

A Facial Recognition Giant Refuses to Share Details About Its Algorithm Dataset

NEC claims its systems aren’t biased — but rejects calls for transparency

Dave Gershgorn
Published in OneZero
3 min read · Jul 2, 2020


Photo illustration. Photo: David Ramos/Stringer/Getty Images

The South Wales Police have been using live facial recognition since 2017 and claim to have made the first arrest in the United Kingdom with the technology. Officers have surveillance cameras mounted on top of a large white van; the cameras record pedestrians and attempt to determine their identities.

While many facial recognition programs are hidden from the public eye or revealed only through public records requests, the U.K. has been touting its programs, publicizing the use of NEC’s technology by the South Wales Police and London’s Metropolitan Police.

Now, a lawsuit from U.K. human rights group Liberty is challenging the use of the software, claiming that it’s ineffective and racially biased.

How the U.K. determines whether facial recognition can be used within its borders could set a precedent for other European countries, just as the first cities in the United States to ban the technology have done.

Privacy advocates in the United States have been pushing a more radical solution: You don’t have to worry about biased data being used to train a facial recognition algorithm if you ban facial recognition entirely. As companies like Amazon, Microsoft, and IBM either pause or step away from police facial recognition contracts and municipalities like San Francisco and Boston ban the technology, the movement against facial recognition is undoubtedly growing in the United States.

A rebuke in the U.K. could help U.S. activists' movement grow into a worldwide one, with particular consequences for NEC, which has more than 1,000 contracts around the globe.

NEC’s response to the lawsuit has lacked detail, to say the least, according to The Register.

The company has allegedly refused to provide any details of what data is used to train its algorithms to distinguish one face from another, and the police using the technology don’t know how the algorithm was trained. There’s reason for concern: A trial of NEC’s technology in 2018 had a 98% failure rate, and a…

