Emotion detection: the idea that a computer can read your facial expressions and body language to figure out how you're feeling. It's a compelling idea for software developers in a huge range of markets. So much so that some experts estimate the market for emotion detection software will be as much as $3.8 billion by 2025. That's a tidy figure, especially for a field with a thin track record of actual success.
Emotion detection purports to use technology to read people's emotional state based on objective physical input. By reading information like facial expressions, body language, eye tracking, heart rate, and respiratory rate, emotion detection software attempts to draw a conclusion about an individual's actual emotions. The vast majority of emotion detection software relies on facial expressions to draw conclusions about the emotions of the person in front of the camera.
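As a rough sketch of how such software typically maps physical input to an emotion label, consider the toy classifier below. The feature names, thresholds, and rules are all invented for illustration; real products use trained models rather than hand-written rules, but the underlying assumption of a one-to-one mapping from expression to emotion is the same.

```python
# Hypothetical sketch of an emotion-detection pipeline: extract facial
# features, then map them to a single emotion label. All feature names
# and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class FacialFeatures:
    mouth_corner_raise: float  # 0.0-1.0, e.g. from a facial landmark detector
    brow_lowering: float       # 0.0-1.0
    lip_press: float           # 0.0-1.0

def classify_emotion(f: FacialFeatures) -> str:
    """Naive rule-based mapping from facial features to an emotion label.

    This is exactly the kind of one-to-one mapping researchers criticize:
    a lowered brow is read as 'angry' even though the same configuration
    can also signal concentration.
    """
    if f.mouth_corner_raise > 0.6:
        return "happy"
    if f.brow_lowering > 0.6:
        return "angry"  # ...or simply concentrating
    if f.lip_press > 0.6:
        return "frustrated"
    return "neutral"

# A scowling face and a deeply concentrating face can produce identical
# feature values, so this classifier cannot tell them apart.
print(classify_emotion(FacialFeatures(0.1, 0.8, 0.2)))
```

The point of the sketch is that the ambiguity lives in the input, not the code: no amount of model sophistication recovers a distinction the features never carried.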
Emotion detection can't work reliably, according to a group of researchers who just published their findings after reviewing thousands of psychological studies spanning several decades of emotion/physical expression research.
"Facial configurations...are not 'fingerprints' or diagnostic displays that reliably and specifically signal particular emotional states," the paper conclude:
"It is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts."
Some facial expressions--like a smile--are fairly universal across all cultures. Other expressions are so ambiguous they could mean a wide variety of things: a frown could mean you're sad, frustrated, angry, or simply that you're concentrating. That's true for a huge range of individuals across the spectrum of cultural and circumstantial contexts. Even the meaning of a smile can be less than clear: it can be a grimace of pain or a sarcastic laugh. According to researchers, the level of variation across people's facial expressions and body language makes those features alone a notably unreliable method of predicting the emotions of a person on the other side of a camera lens.
Emotion detection isn't a brand-new field. It operates on much the same premise as a polygraph (lie detector) test, by assuming that certain objective characteristics always hold true for the people in question. Unfortunately (or perhaps fortunately), it's not that simple. Physical movements, characteristics, and tendencies don't always correlate with a specific emotion or intention, and relying on them too much can be disastrous.
This research flies in the face of a multi-million dollar industry churning out sparkling new emotion detection software for a massive array of software applications from video games to the medical field. What does the future of emotion detection hold? It's impossible to know for sure, but for now, the best emotion detector out there is still the imperfect, inconsistent one operating moment-by-moment in your own brain.