General Intelligence

New Facial Recognition Tech Only Needs Your Eyes and Eyebrows

You won’t be able to hide behind a mask


Welcome to General Intelligence, OneZero’s weekly dive into the A.I. news and research that matters.

The term “facial recognition” typically refers to technology that can identify your entire face. How that recognition happens varies, and may involve infrared or lidar sensors, but either way the system needs the geometry of a person’s entire face to work.

But in the coronavirus era, when everyone is advised to wear a mask, exposed faces are increasingly rare. That’s breaking facial recognition systems everywhere, from iPhones to public surveillance apparatuses.


Now, facial recognition company Rank One says it has a solution. This week, the company released a new form of facial recognition called periocular recognition, which can supposedly identify individuals by their eyes and eyebrows alone. The new system uses an entirely different algorithm from Rank One’s standard facial recognition and is specifically meant for masked individuals; the company says it will ship the technology to all of its active customers for free.
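Rank One hasn’t published how its periocular algorithm actually works, but the general recipe in the research literature is straightforward: isolate the eye-and-eyebrow region of an aligned face, turn it into a feature vector, and compare vectors by similarity. Here’s a rough sketch in Python. The `embed` function is a placeholder for whatever feature extractor a vendor actually trains, and the crop proportions are illustrative guesses, not Rank One’s values.

```python
import numpy as np

def crop_periocular(face_img: np.ndarray) -> np.ndarray:
    """Crop the upper portion of an aligned face image.

    Assumes the face has already been detected and aligned so that the
    eyes and eyebrows fall roughly in the top 20-55% of the crop.
    """
    h = face_img.shape[0]
    return face_img[int(0.20 * h):int(0.55 * h), :]

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe_img, gallery_img, embed, threshold=0.6) -> bool:
    """Compare two faces using only the periocular region.

    `embed` stands in for any feature extractor that maps an image to a
    fixed-length vector; real systems train it on masked or occluded faces.
    """
    probe_vec = embed(crop_periocular(probe_img))
    gallery_vec = embed(crop_periocular(gallery_img))
    return cosine_similarity(probe_vec, gallery_vec) >= threshold
```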

It’s difficult to pinpoint how many companies and government organizations in the United States use Rank One. Its technology is resold by companies with connections to police departments and federal agencies, like DataWorks Plus and Secure Planet. For instance, DataWorks Plus’ contracts in California include access to Rank One’s algorithms and have the potential to search through records of 15 million citizens.

As with traditional facial recognition systems, accuracy is a concern. Rank One claims that in a controlled environment, with the person facing the camera, the system’s “false non-match rate” (FNMR), a measure of how often the system misses a match, is only 1.5%. When the target is not directly facing the camera, the FNMR jumps to 15%. Those numbers don’t account for false positives, and according to Rank One’s own figures the algorithm gets less accurate as it scales: 1.5% of 10,000 periocular recognition searches return false negatives, but that rises to 5% when 1,000,000 searches are made.
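If the error-rate jargon is confusing, the arithmetic behind it is simple. FNMR counts the genuine comparisons the system wrongly rejects; the false match rate, which Rank One doesn’t report, counts impostor comparisons it wrongly accepts. A minimal sketch, assuming a list of labeled comparison results rather than any real vendor data:

```python
def error_rates(results):
    """Compute false non-match and false match rates.

    `results` is a list of (same_person: bool, system_said_match: bool) pairs.
    FNMR = missed matches / genuine comparisons (the figure Rank One reports).
    FMR  = wrongly declared matches / impostor comparisons (not reported).
    """
    genuine = [r for r in results if r[0]]
    impostor = [r for r in results if not r[0]]
    fnmr = sum(1 for same, said in genuine if not said) / len(genuine)
    fmr = sum(1 for same, said in impostor if said) / len(impostor)
    return fnmr, fmr

# e.g. 10,000 genuine comparisons with 150 misses -> FNMR of 1.5%
```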

Rank One isn’t the first company to invest in periocular recognition. Recent research has shown it to be a viable form of facial recognition, though it has typically been tested in lab settings with professionally taken images.

We don’t know how widely periocular recognition is used today, but one thing’s clear: Soon enough, you won’t be able to hide behind a mask.

And now, here are some of the most interesting A.I. research papers of the week.

Privacy and Ethics Recommendations for Computing Applications Developed to Mitigate COVID-19

The National Security Commission on Artificial Intelligence, largely staffed by industry insiders and academics, released a white paper on ethical and privacy guidelines for how to deploy algorithms to tackle the coronavirus. More than anything, the document preaches restraint and deliberation to ensure the technologies do not further the coronavirus's disproportionate impact on the poor and communities of color.

CONFIG: Controllable Neural Face Image Generation

Microsoft researchers have built a face generation algorithm that gives an immense amount of control over facial attributes. The algorithm can be told to create a face with its eyes closed, with facial hair, or one that’s looking in a particular direction.
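The sketch below doesn’t reproduce CONFIG’s architecture; it just illustrates the general idea of attribute-conditioned generation, with a hypothetical generator interface and a made-up control-vector layout in which identity and attributes are supplied as separate inputs.

```python
import numpy as np

# Hypothetical attribute layout; CONFIG's actual parameterization differs.
ATTRIBUTES = {"eye_openness": 0, "facial_hair": 1, "gaze_yaw": 2}

def make_control_vector(eye_openness=1.0, facial_hair=0.0, gaze_yaw=0.0):
    """Pack interpretable controls into a vector a generator can consume."""
    v = np.zeros(len(ATTRIBUTES), dtype=np.float32)
    v[ATTRIBUTES["eye_openness"]] = eye_openness   # 0 = closed, 1 = open
    v[ATTRIBUTES["facial_hair"]] = facial_hair     # 0 = none, 1 = full beard
    v[ATTRIBUTES["gaze_yaw"]] = gaze_yaw           # -1 = left, +1 = right
    return v

def generate_face(generator, latent, controls):
    """Condition a generator on a random latent code plus explicit controls.

    `generator` stands in for any trained conditional model; the key idea is
    that identity (latent) and attributes (controls) are separate inputs, so
    you can change one without disturbing the other.
    """
    return generator(np.concatenate([latent, controls]))
```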

The computerization of archaeology: survey on AI techniques

A fascinating overview of how deeply artificial intelligence has become intertwined with an established field like archaeology. A.I. is currently used for everything from calculating a skeleton’s stature to grouping pottery fragments for easier reconstruction.

When Machine Unlearning Jeopardizes Privacy

Wiping personal data from algorithms might be harder than once believed. In five tests, researchers found that the “unlearned” data allegedly wiped from image recognition and categorization algorithms could be rediscovered by comparing how the model behaves before and after the data was erased.
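The intuition behind that finding is easy to sketch: query the same model before and after a deletion request and measure how much its output shifts for a candidate record. The snippet below is an illustration of that idea with placeholder model functions, not the authors’ actual attack, which feeds such differences into a trained inference model.

```python
import numpy as np

def deletion_signal(model_before, model_after, candidate_x):
    """Score how much a candidate record's predictions shift after unlearning.

    `model_before` and `model_after` are placeholders for the same classifier
    queried before and after a deletion request; each returns a probability
    vector for an input. A large shift suggests the candidate was the record
    that got "unlearned" -- the leakage the paper describes.
    """
    p_before = np.asarray(model_before(candidate_x), dtype=np.float64)
    p_after = np.asarray(model_after(candidate_x), dtype=np.float64)
    # L2 distance between the two output distributions.
    return float(np.linalg.norm(p_before - p_after))
```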

Insider Threat Detection Based on Stress Recognition Using Keystroke Dynamics

Researchers explore how an “insider threat” can be detected by tracking a computer’s mouse and keyboard and analyzing keystroke patterns for telltale changes. The paper posits that illegal actions cause stress, which in turn changes how a person operates a computer.
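The raw material for this kind of system is the classic pair of keystroke-dynamics signals: hold times and flight times. Here’s a minimal sketch of extracting them from hypothetical (key, press, release) timestamp events; the paper’s actual stress model is more involved than averaging.

```python
def keystroke_features(events):
    """Extract simple timing features from (key, press_time, release_time) events.

    Hold time (how long a key is held) and flight time (the gap between
    releasing one key and pressing the next) are the standard signals in
    keystroke dynamics; stress detection would watch for shifts in their
    distributions over time.
    """
    hold_times = [release - press for _, press, release in events]
    flight_times = [
        events[i + 1][1] - events[i][2]  # next press minus current release
        for i in range(len(events) - 1)
    ]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {
        "mean_hold": mean(hold_times),
        "mean_flight": mean(flight_times),
    }

# e.g. keystroke_features([("a", 0.00, 0.09), ("b", 0.25, 0.33)])
```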

Senior Writer at OneZero covering surveillance, facial recognition, DIY tech, and artificial intelligence. Previously: Qz, PopSci, and NYTimes.
