The Shoddy Science Behind Emotional Recognition Tech
People’s facial expressions line up with their emotions as little as 20% to 30% of the time
OneZero’s General Intelligence is a roundup of the most important artificial intelligence and facial recognition news of the week.
Facial recognition isn’t just for verifying a person’s identity. In recent years, researchers and startups have focused on other ways to apply the technology, like emotion recognition, which tries to read facial expressions to understand what a person is feeling.
For instance, Find Solution AI, a Hong Kong–based company recently featured in CNN Business, is selling its technology to schools and colleges, where it scans students’ faces and monitors their feelings in virtual classrooms. In theory, systems like these could detect whether children are paying attention or expressing frustration that signals difficulty learning the material.
Academics and A.I. ethics researchers, however, are quick to point out that this technology relies on questionable science, and that there are serious ethical concerns about whom the technology is used to surveil.
Kate Crawford, co-founder of the AI Now Institute and senior principal researcher at Microsoft Research, pushed back on Find Solution AI’s claims that its technology could tell what children were feeling.
Find Solution AI, like most other emotion recognition startups, bases its technology on the work of Paul Ekman, a psychologist who published influential research on the similarities between facial expressions around the world and popularized the idea of “seven universal emotions.” Actor Tim Roth even played a dramatized version of Ekman in the Fox drama Lie to Me.
That research has not translated well into the real world. A TSA program that trained agents to spot terrorists using Ekman’s work found little scientific basis, didn’t result in arrests, and fueled racial profiling, according to reports from the Government Accountability Office and the ACLU.
A meta-review of 1,000 studies found that the link between facial expressions and emotions is far from universal: people make the expected facial expression matching their emotional state only 20% to 30% of the time, the researchers said.
But this technology is still being pushed on those who don’t have the power to refuse it: children in virtual classrooms, job candidates in virtual interviews, Amazon workers monitored by cameras while they deliver packages, and even people being questioned by police.
“We need to scrutinize why entities are using faulty technology to make assessments about character on the basis of physical appearance in the first place,” researchers from AI Now wrote in their 2019 report.