Human faces pop up on a screen, hundreds of them, one after another. Some have their eyes stretched wide, others show lips clenched. Some have eyes squeezed shut, cheeks lifted and mouths agape. For each one, you must answer this simple question: is this the face of someone having an orgasm or experiencing sudden pain?
Researchers are increasingly split over the validity of Ekman’s conclusions. But the debate hasn’t stopped companies and governments from accepting his assertion that the face is an emotion oracle — and using it in ways that are affecting people’s lives. In many legal systems in the West, for example, reading the emotions of a defendant forms part of a fair trial. As US Supreme Court justice Anthony Kennedy wrote in 1992, doing so is necessary to “know the heart and mind of the offender”.
Another reason for the lack of evidence for universal expressions is that the face is not the whole picture. Other signals, including body movement, personality, tone of voice and changes in skin tone, have important roles in how we perceive and display emotion. For example, changes in emotional state can affect blood flow, and this in turn can alter the appearance of the skin. Martinez and his colleagues have shown that people are able to connect changes in skin tone to emotions. The visual context, such as the background scene, can also provide clues to someone’s emotional state.
Barrett thinks that more data and analytical techniques could help researchers to learn something new, instead of revisiting tired data sets and experiments. She throws down a challenge to the tech companies eager to exploit what she and many others increasingly see as shaky science. “We’re really at this precipice,” she says. “Are AI companies going to continue to use flawed assumptions, or are they going to do what needs to be done?”