Perceiving whether someone is sad, happy, or angry by the way he turns up his nose or knits his brow comes naturally to humans. Most of us are good at reading faces. Really good, it turns out.
So what happens when computers catch up to us? Recent advances in facial recognition technology could give anyone sporting a future iteration of Google Glass the ability to detect inconsistencies between what someone says (in words) and what that person says (with a facial expression). Technology is surpassing our ability to discern such nuances.
Scientists long believed humans could distinguish six basic emotions: happiness, sadness, fear, anger, surprise, and disgust. But earlier this year, researchers at Ohio State University found that humans can reliably recognize more than 20 facial expressions and corresponding emotional states—including a vast array of compound emotions like “happy surprise” or “angry fear.” Recognizing tone of voice and identifying facial expressions are tasks in the realm of perception where, traditionally, humans perform better than computers. Or, rather, that used to be the case. As facial recognition software improves, computers are getting the edge. When the Ohio State researchers ran their task through a facial recognition program, the software identified the six basic emotions with 96.9 percent accuracy and the compound emotions with 76.9 percent accuracy. Computers are now adept at figuring out how we feel.
Much of this kind of computation is based on the so-called Facial Action Coding System (FACS), a method developed by Paul Ekman, a specialist in facial micro-expressions, during the 1970s and 1980s. FACS decomposes emotional expressions into their distinct facial elements. In other words, it breaks down emotions to specific sets of facial muscles and movements: the widening of the eyes, the raising of the cheeks, the dropping of the lower lip, and so on. FACS is used in the design and construction of characters in animated films. It’s also used by cognitive scientists to identify genes, chemical compounds, and neuronal circuits that regulate the brain’s production of emotions. Such mapping could aid the diagnosis of conditions like autism or post-traumatic stress disorder, in which recognizing emotions from facial expressions can be difficult.
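To make that decomposition concrete, here is a minimal sketch in Python of how a FACS-style classifier might map observed facial movements to emotions. The action unit (AU) numbers and names follow the published FACS taxonomy, but the emotion “prototypes” and the matching rule are simplified assumptions for illustration, not the procedure used in the Ohio State study.

```python
# A minimal sketch of FACS-style emotion recognition, assuming simplified
# emotion "prototypes". The AU numbers and names follow the published FACS
# taxonomy; the prototypes and matching rule are illustrative assumptions.

# A handful of FACS action units (AUs), each a distinct facial movement
ACTION_UNITS = {
    1: "inner brow raiser",
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
    15: "lip corner depressor",
    26: "jaw drop",
}

# Simplified prototypes: the set of AUs that tends to accompany each emotion
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},     # raised cheeks + lip corners pulled up
    "sadness": {1, 4, 15},    # inner brows raised, brows knit, lip corners down
    "surprise": {1, 26},      # inner brows raised + jaw dropped
}

def classify(observed_aus: set[int]) -> str:
    """Return the emotion whose AU prototype best matches the observed AUs."""
    def overlap(emotion: str) -> float:
        prototype = EMOTION_PROTOTYPES[emotion]
        # Jaccard similarity: shared AUs divided by all AUs involved
        return len(prototype & observed_aus) / len(prototype | observed_aus)
    return max(EMOTION_PROTOTYPES, key=overlap)

print(classify({6, 12}))     # -> happiness
print(classify({1, 4, 15}))  # -> sadness
```

A compound expression like “happy surprise” would mix action units from two prototypes at once, which is why real systems score partial overlaps rather than demanding exact matches.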
Read the full article at The Atlantic.
- Publication Type: Other Writing
- Publication Date: 07/09/2015