Verified by Psychology Today


Exploring Emotion Recognition With Brain-Machine Interfaces

A new brain imaging technique may soon facilitate quantitative emotion research.

Key points

  • Emotions are subjective experiences, making them challenging to quantify in scientific settings.
  • Researchers from Tianjin University have developed a brain scan system to track patients' emotions.
  • Emotion recognition could have applications in mental health treatment, healthcare communication, and more.
Source: vecstock / Freepik

One of the great mysteries of humankind is the perception of emotion. We understand the parts of the brain where emotions arise and the chemical reactions that produce them, but feelings remain a subjective experience, differing from person to person. Another person's emotional state is a black box that we can only estimate from the outside.

Some are better than others at this estimation. For instance, behavioral therapists and psychologists specialize in understanding the emotional perspective of others and use counseling as a method of helping the individual understand their emotional self.

However, emotions are challenging to quantify in scientific settings. Traditional measurement approaches, such as questionnaires and the coding of facial expressions, are far from statistically sound.

For the first time, we may have a method of cracking open the black box and understanding emotion through scientifically sound measurement. New emotion recognition software using brain-machine interface technologies may enable researchers to take a more accurate and universal approach to emotion recognition.

In a recent study published in Cyborg and Bionic Systems, Xiaopeng Si and colleagues from Tianjin University in China describe a brain imaging technique that can now be applied to emotion recognition research. While traditional methods are subjective and imprecise, tracking emotions using brain-machine interfaces may introduce more objectivity to research.

The two-part system described by Si and colleagues is simple and elegant. First, the researchers gather brain images from the participant using functional near-infrared spectroscopy (fNIRS). This imaging technique measures changes in blood oxygenation levels in the brain, which serve as a proxy for neural activity.

fNIRS provides real-time measurements of brain activity without the inconvenience of a CT or MRI machine. It is typically administered with little discomfort via a wired or wireless cap worn by the participant.
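As a rough illustration of the underlying signal, fNIRS instruments typically recover changes in oxy- and deoxy-hemoglobin concentration from detected light intensity at two wavelengths using the modified Beer-Lambert law. The extinction coefficients, source-detector separation, and pathlength factor below are illustrative assumptions for the sketch, not values from the study:

```python
import numpy as np

# Illustrative molar extinction coefficients (1 / (mM * cm)) for
# oxy-hemoglobin (HbO2) and deoxy-hemoglobin (HbR) at two typical
# fNIRS wavelengths. These are assumed demo values, not the study's.
E = np.array([[1.49, 3.84],   # 760 nm: [HbO2, HbR]
              [2.53, 1.80]])  # 850 nm: [HbO2, HbR]

d = 3.0    # assumed source-detector separation (cm)
dpf = 6.0  # assumed differential pathlength factor

def delta_hb(i_baseline, i_task):
    """Recover changes in HbO2 and HbR concentration (mM) from the
    light intensities detected at the two wavelengths, using the
    modified Beer-Lambert law: delta_OD = E * d * DPF * delta_c."""
    delta_od = -np.log10(np.asarray(i_task) / np.asarray(i_baseline))
    return np.linalg.solve(E * d * dpf, delta_od)
```

In practice, a rise in oxygenated hemoglobin over a cortical region is read as increased neural activity there, which is what the emotion classifier then operates on.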

After the brain scan is collected, the data is sent through a dual-branch joint network.

The first of the dual branches is a spatial analysis, which looks at the data from the individual's brain scan and determines whether the associated emotion is positive, negative, or neutral based on areas of activation in the brain.

The second branch is the statistics branch, which looks at more intricate details, such as the time it takes for a brain signal to move through the different sections of the brain. Different emotional signals move through the brain in different ways. Using this data, the statistics branch makes its own estimate of the positivity, negativity, or neutrality of the emotion.

Both branches then jointly conclude the nature of the individual's emotion.
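In spirit, that final step can be sketched as a fusion of the two branches' outputs. The weighted-averaging rule and the probability values below are illustrative assumptions; the actual joint network described in the paper learns how to combine the branches end to end:

```python
import numpy as np

LABELS = ["positive", "neutral", "negative"]

def joint_decision(p_spatial, p_statistical, weight=0.5):
    """Fuse the class probabilities from the spatial branch and the
    statistics branch by weighted averaging, then return the label
    with the highest combined probability."""
    p_spatial = np.asarray(p_spatial, dtype=float)
    p_statistical = np.asarray(p_statistical, dtype=float)
    combined = weight * p_spatial + (1.0 - weight) * p_statistical
    return LABELS[int(np.argmax(combined))]

# Hypothetical outputs: the spatial branch leans positive,
# the statistics branch is less certain.
print(joint_decision([0.6, 0.3, 0.1], [0.4, 0.35, 0.25]))  # prints "positive"
```

The appeal of a joint decision is that each branch can compensate for the other: when activation patterns are ambiguous, signal-timing statistics can still tip the classification, and vice versa.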

To test their emotion recognition system, the researchers showed 18 subjects 24 videos of varying emotional content: six happy, six sad, and 12 neutral (as predetermined by a separate rating committee).

Each subject watched the 24 videos while connected to the fNIRS system, with sufficient buffer time between each video. The brain scans were then analyzed by the dual-branch joint network system.

The system was evaluated in two ways. First, could the dual-branch joint network correctly distinguish among positive, neutral, and negative emotions? Second, in a two-category format, could it distinguish positive from neutral and, separately, negative from neutral?

In both cases, the system succeeded. Across all three emotions, the system was 74.8 percent accurate. In the two-category format, the system was 89.5 percent accurate when distinguishing positive versus neutral and 91.7 percent accurate for negative versus neutral.

While these results are promising, the question remains: How can something like this be used for everyday applications?

There are a few immediate medical applications and some speculative long-term uses. First and foremost, as discussed earlier, using emotion recognition in scientific research will allow for less subjective and more exact measurement of emotional feedback in research settings.

An additional medical use would be in mental health diagnosis and treatment. Emotion recognition software may be used to analyze patients with depression, anxiety, or post-traumatic stress disorder (PTSD) and then use the analysis results to customize treatment options.

Some more speculative applications lie outside the world of medicine.

In future artificial intelligence applications, computers may interpret and respond to human emotion. Further, in marketing and advertising, consumer analysts may develop ad campaigns around the emotional responses of a target demographic.

The applications are widespread, though far off. For now, we can marvel at the advances the Si emotion recognition system brings to the medical field and admire the renaissance of scientific progress we find ourselves in today and in the coming years.

More from William A. Haseltine Ph.D.