Amazon’s A.I. Emotion-Recognition Software Confuses Expressions for Feelings
People’s faces say a lot less about their emotions than companies think

In August, Amazon announced it had improved “accuracy for emotion detection” in its facial-recognition software. Not only could the technology better detect seven emotions (happy, sad, angry, surprised, disgusted, calm, and confused), the company explained, but it could for the first time detect an eighth emotion: fear.
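To make the claim concrete, here is a minimal sketch of how a developer might pull an emotion label from Amazon Rekognition. The real service call (shown in a comment) requires AWS credentials, so a mock response standing in for the API's documented `FaceDetails`/`Emotions` shape is used here; the confidence values and the `top_emotion` helper are illustrative assumptions, not Amazon's code.

```python
# Hypothetical sketch of reading Rekognition's emotion output.
# The real call (requires AWS credentials and the boto3 SDK) would be:
#   import boto3
#   resp = boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])
# Below, a mock response with the documented shape stands in for `resp`.

def top_emotion(response):
    """Return the highest-confidence emotion label for the first detected face."""
    faces = response.get("FaceDetails", [])
    if not faces:
        return None
    emotions = faces[0].get("Emotions", [])
    best = max(emotions, key=lambda e: e["Confidence"], default=None)
    return best["Type"] if best else None

# Mock response; confidence numbers are invented for illustration.
mock_response = {
    "FaceDetails": [{
        "Emotions": [
            {"Type": "CALM", "Confidence": 62.1},
            {"Type": "FEAR", "Confidence": 21.4},
            {"Type": "HAPPY", "Confidence": 9.8},
        ]
    }]
}

print(top_emotion(mock_response))  # -> CALM
```

Note that the API returns a confidence score per emotion, not a single verdict; the critique below is precisely that even a high-confidence "CALM" reflects a facial configuration, not a feeling.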
Major tech companies like Microsoft and Apple, along with startups like Kairos and Affectiva, sell similar emotion-detecting products. This relatively new service category is, by one estimate, projected to become a $25 billion industry by 2023.
Technology that reads emotions based only on facial expressions would be an astonishing breakthrough. It could, for instance, allow marketers to survey viewers’ reactions to horror movies, built-in car software to assess whether a driver is angry or drowsy, and companies to weed out bored or uninterested job candidates. It might even help children with autism learn to recognize others’ emotions. But in recent months, scientists have raised considerable doubt that using faces to read emotions is even possible.
The problem isn’t that technology like Amazon’s Rekognition fails to read the details of faces. It’s that faces are not necessarily accurate signals of emotions.
In July, Lisa Feldman Barrett, a neuroscientist and distinguished professor at Northeastern University, led a review of more than 1,000 studies that concluded that many efforts in artificial intelligence and computer vision to detect emotions are misguided. And last month, Jonathan Gratch, a computer science professor at the University of Southern California, and his colleagues presented two papers at the Eighth International Conference on Affective Computing and Intelligent Interaction calling for a pause on some “emotion analytics” techniques.