General Intelligence

The Shoddy Science Behind Emotional Recognition Tech

People’s facial expressions line up with their emotions less than half the time

OneZero’s General Intelligence is a roundup of the most important artificial intelligence and facial recognition news of the week.

Facial recognition isn’t just for verifying a person’s identity. In recent years, researchers and startups have focused on other ways to apply the technology, like emotion recognition, which tries to read facial expressions to understand what a person is feeling.

For instance, Find Solution AI, a company based in Hong Kong that was recently featured in CNN Business, is selling its technology to schools and colleges, where it scans students’ faces and monitors their feelings in virtual classrooms. Theoretically, systems like these could detect whether children are paying attention or showing frustration that suggests they’re struggling with the class material.

Academics and A.I. ethics researchers, however, are quick to point out that this technology relies on questionable science and raises serious ethical concerns about whom it is used to surveil.

Kate Crawford, co-founder of the AI Now Institute and senior principal researcher at Microsoft Research, pushed back on Find Solution AI’s claims that its technology could tell what children were feeling.

Find Solution AI, like most other emotion recognition startups, bases its technology on the work of Paul Ekman, a psychologist whose widely cited research on the similarity of facial expressions across cultures popularized the idea of “seven universal emotions.” Actor Tim Roth even played a dramatized version of Ekman in the Fox drama Lie to Me.
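To see how reductive that recipe is, here is a minimal, purely illustrative Python sketch — not Find Solution AI’s or any vendor’s actual code — of the Ekman-style approach most of these products follow: score a face against a fixed list of “universal” emotions and report whichever label scores highest. The labels, the function name, and the stand-in scores are all assumptions for illustration.

```python
# Hypothetical sketch of Ekman-style emotion recognition: a face is scored
# against a fixed set of "universal" emotion labels and the top score wins.
# The scores stand in for the output of a trained face classifier.

EKMAN_EMOTIONS = [
    "anger", "contempt", "disgust", "fear", "happiness", "sadness", "surprise",
]

def label_face(scores: list[float]) -> str:
    """Return the emotion label with the highest model score.

    The assumption baked in here -- that the face reveals the feeling --
    is exactly what the research described below calls into question.
    """
    if len(scores) != len(EKMAN_EMOTIONS):
        raise ValueError("expected one score per emotion label")
    best_index = max(range(len(scores)), key=lambda i: scores[i])
    return EKMAN_EMOTIONS[best_index]

# Example: a model that is most "confident" a student's expression means anger.
print(label_face([0.61, 0.02, 0.03, 0.05, 0.12, 0.10, 0.07]))
```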

That research has not translated well into the real world. A TSA program that trained agents to spot terrorists using Ekman’s work had little scientific basis, did not result in arrests, and fueled racial profiling, according to reports from the Government Accountability Office and the ACLU.

A meta-review of 1,000 studies found that the link between facial expressions and emotions is far from universal: people make the facial expression expected for their emotional state only 20% to 30% of the time, the researchers reported.

But this technology is still being pushed on those who don’t have the power to refuse it: children in virtual classrooms, job candidates in virtual interviews, Amazon workers with cameras trained on them as they deliver packages, and even people being questioned by police.

“We need to scrutinize why entities are using faulty technology to make assessments about character on the basis of physical appearance in the first place,” researchers from AI Now wrote in their 2019 report.

