Microsoft was recently awarded a patent for a pair of glasses that the wearer can use to monitor the emotional states of those around them. But can a machine really comprehend how humans feel? Like other devices on the market, Microsoft’s glasses appear to rely on measurement of facial and vocal expression to determine whether individuals are happy, sad, anxious, angry, or excited.

Despite the enthusiasm for the potential of this and similar devices, things are not quite what they seem.

The science behind the study of facial expressions and emotion is nothing new. Charles Darwin was the first to suggest that the facial expression of emotion is universal across cultures and genetically based. Classic cross-cultural studies by the psychologist Paul Ekman put Darwin's suggestions on firmer empirical footing, and Ekman's findings, along with hundreds of subsequent studies, have been applied to everything from clinical psychology to law enforcement. More recently, manufacturers have developed sophisticated computer recognition and analysis methods to automate the processing of emotional information from faces.

The current media coverage focuses on the value of a technological device that could let us know how someone else is feeling by watching them through an overlay placed over our own eyes. But most of us already own a pretty powerful piece of technology that does a good job of signaling the emotional states of those we are with: It is called the brain. In fact, the accuracy of the computer versions of emotion-mining technology is usually established by demonstrating that the machine shows good agreement with a normal human observer. So I doubt that such devices will afford us superhuman abilities to figure out what is going on inside another person's head. Excepting those of us with autism spectrum disorder or another condition that blunts empathy, we already have such abilities in abundance. (For those who do have difficulty reading faces, this technology will surely be a great boon.)
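
To make that validation step concrete, here is a minimal sketch, in Python, of how agreement between a machine's emotion labels and a human rater's is commonly quantified, using Cohen's kappa (chance-corrected agreement). The labels below are invented for illustration; this is not any vendor's actual data or method.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters:
    kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater_a)
    # Observed agreement: fraction of items where the two raters match.
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented emotion labels for ten short video clips.
machine = ["happy", "sad", "happy", "angry", "happy",
           "sad", "happy", "anxious", "happy", "sad"]
human   = ["happy", "sad", "happy", "angry", "sad",
           "sad", "happy", "anxious", "happy", "happy"]

print(f"kappa = {cohens_kappa(machine, human):.2f}")
# kappa = 0.69, substantial agreement by conventional standards
```

Notice that the ceiling in this scheme is the human judge: the software is certified by matching us, not by exceeding us.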

Companies, mostly aiming their software at marketing professionals, are trying to make this valuable data available to those who would use emotional states to gauge the popularity of a product, in the hope of tweaking its appeal. We have begun to use one such product in our laboratory at the University of Waterloo, and it seems to work reasonably well, though it is easy to fool: if I mug at my computer with an exaggerated rictus, it concludes that I am genuinely happy.
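
To see why such software is so easy to fool, consider a toy rule keyed to facial geometry (the landmark coordinates and threshold below are invented for illustration; commercial systems use many more features, but the logic is similar): a deliberately posed grin moves the mouth corners the same way a genuine smile does, so a rule built on that movement cannot tell the two apart.

```python
# Toy expression "classifier": calls a face happy whenever the mouth
# corners sit above the mouth center. Landmarks are (x, y) in
# normalized face coordinates, with y increasing upward.

def looks_happy(mouth_left, mouth_right, mouth_center):
    corner_lift = (mouth_left[1] + mouth_right[1]) / 2 - mouth_center[1]
    return corner_lift > 0.1  # arbitrary threshold for this sketch

# A genuine smile and an exaggerated posed rictus move these same
# landmarks in the same direction, so both pass the test.
genuine = dict(mouth_left=(0.35, 0.32), mouth_right=(0.65, 0.32),
               mouth_center=(0.50, 0.20))
posed   = dict(mouth_left=(0.33, 0.40), mouth_right=(0.67, 0.40),
               mouth_center=(0.50, 0.18))

print(looks_happy(**genuine))  # True
print(looks_happy(**posed))    # True: the mugging face reads as happy
```

The cues that distinguish a felt smile from a posed one, such as the involuntary crinkling around the eyes that Ekman called the Duchenne marker, are subtler and far harder to measure.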

Microsoft's goggles and similar tools could help realize a world in which human emotions are manipulated to produce actions that are not in our best interests. Research suggests that we are more likely to make impulse purchases, perhaps beyond our means, when we feel really good. A Russian company, Synqera, is trying to capitalize on this by offering an emotion-recognition machine that can be placed at checkout counters to monitor shoppers' feelings and tailor special offers to their current mood. Such devices leave the human observer out of the equation entirely, replacing the savvy salesperson who learns how to prey on our affective states of mind to pry more money from our wallets.

This may work in promoting sales to some extent, but does it offer omniscience? 

Over the past couple of decades, research has transformed our understanding of how affect works. Gone are the days when we believed that to be human was to take part in an epic battle between supremely rational cognition (our minds) and a seething morass of reptilian sentiments and feelings (our hearts) that constantly sought to overturn reasoned behavior in favor of our more primal urges. The modern view is instead that, just as Darwin argued, our emotions exist to promote adaptive behavior: they are exquisitely nuanced things, linked with and inseparable from our thoughts, reasoning processes, and decision-making. Further, we are beginning to understand how much of our behavior is shaped by the matrix of history, culture, and environment in which any action takes place. To suppose that a machine can monitor the movements of a handful of facial muscles and then interpret the role of emotions in our behavior better than a wise and experienced human observer seems to cheapen something about us that we might be better off cherishing and celebrating.

In our eagerness to accept that we are on the threshold of being superseded by technology—that we can’t beat a machine at chess, or Jeopardy, or other competitions that depend on pattern matching or abundant memory—we sell ourselves short and cheapen our authentic experiences, our profound abilities to understand and empathize. While computing machines surpass human ability in many beneficial ways, accepting that any current machine or computer program can do a credible job of comprehending our feelings seems to trivialize the human experience and spirit.

