Professor identifies brain signature for negative emotion

One of the images Chang used in the study to elicit negative emotions. (Source: Flickr, Credit: AndreasS)

Professor Luke Chang of Dartmouth College has identified a brain signature that can predict negative emotion with impressive accuracy. By applying machine learning techniques to fMRI data, he mapped disparate regions of the brain that work together to produce strong emotions. These findings could one day serve as ‘benchmarks’ for measuring human emotion (1).

Emotion is a large part of everyday life. Throughout evolution, emotion helped keep humans alive, but in the modern world it can be harmful when it goes unregulated. Affective disorders include anxiety, depression, and psychosis. To better understand these disorders, neuroscientists such as Chang try to understand how the brain represents emotion (1).

Neuroscientists usually study emotion by inducing it in a subject and then using fMRI or EEG to measure the brain’s response. At best, these studies correlate brain states with emotional states, but they are poor at reverse inference – predicting a person’s emotional state from their brain state. To make accurate predictions about emotions, Professor Chang used machine-learning techniques to analyze data about individuals’ brain states and predict their emotional states (1).
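For readers curious what this looks like in practice, below is a minimal sketch of the reverse-inference idea, not Chang’s actual pipeline: a classifier is trained on simulated fMRI voxel patterns and then asked to predict the emotional state (high versus low arousal) of subjects it has never seen. All data, dimensions, and the model choice (a logistic regression from scikit-learn) are illustrative assumptions.

```python
# Illustrative sketch of "reverse inference" with machine learning: train a
# classifier on simulated fMRI voxel patterns, then predict emotional state
# for held-out subjects. NOT Chang's actual pipeline; data, dimensions, and
# the model are assumptions for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)

n_subjects, trials_per_subject, n_voxels = 30, 10, 500
n_trials = n_subjects * trials_per_subject

# Labels: 1 = high-arousal picture, 0 = neutral picture.
y = rng.integers(0, 2, size=n_trials)

# Simulated voxel activations: a shared "signature" pattern appears on
# high-arousal trials, buried under per-voxel noise.
pattern = rng.normal(size=n_voxels)
X = np.outer(y, pattern) + rng.normal(scale=3.0, size=(n_trials, n_voxels))

# Group labels keep each subject's trials together, so the model is always
# tested on people it never trained on -- the key to a generalizable signature.
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, groups=groups, cv=GroupKFold(n_splits=5))
print(f"held-out-subject accuracy: {scores.mean():.2f}")
```

With simulated data this clean, accuracy on held-out subjects will be near ceiling; the point is the workflow, not the number. The essential step is testing on people the model never saw, which is what lets a signature generalize beyond the original participants.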

In the study, 183 subjects viewed pictures in an fMRI scanner and rated their emotional responses. Researchers chose the pictures to induce varying levels of aversion (1).

When asked if he could predict my emotional arousal from my fMRI data, Chang revealed the strength of the signature’s predictive power. He explained, “If we just took two picture sets – the ones that are very neutral and the ones that cause high arousal – we’d be able to tell you with almost 100% accuracy what you are seeing” (2). Across all 183 participants in the study, Chang’s signature correctly distinguished high from low arousal every time (1).

Popular neuroscience has a tendency to emphasize discoveries of “trust regions” or “love regions” of the brain. In reality, complex brain states are not localized to any one region. The brain signature Professor Chang studies was distributed across many regions of the brain, including the amygdala, the medial prefrontal cortex, the insula, parts of the brainstem such as the periaqueductal grey matter, the visual cortex, and the ventral temporal lobe (parahippocampal gyrus) (1). Some of these regions have long been known to play a role in emotion, but others were surprising. Chang suspects that the visual cortex contributes basic image processing and that the prefrontal cortex plays a role in understanding what happens in the pictures (2).

In the short term, these results deepen researchers’ understanding of how the brain represents emotion. Ultimately, Chang hopes that researchers will be able to use the signature to measure different types of emotion. “We’ll be able to measure them so we can see how much different people are experiencing certain things,” he said enthusiastically. He went on to explain that researchers “can also track how they are getting treated and if they’re getting better or worse, like a blood sample” (2). Emotional disorders such as anxiety currently have no objective measure, but, by developing signatures of emotion, clinicians may one day be able to measure how individual patients respond to therapy.

While neuroscience is still far from being able to read minds, this research is a step in the right direction.

References

  1. Chang LJ, Gianaros PJ, Manuck SB, Krishnan A, Wager TD (2015) A Sensitive and Specific Neural Signature for Picture-Induced Negative Affect. PLoS Biol 13(6): e1002180. doi: 10.1371/journal.pbio.1002180
  2. Professor Luke Chang, personal communication, August 14, 2015.
