The ability to recognise and respond to emotionally charged situations is essential to the evolutionary success of a species. A new study published today [July 9th] in Nature Communications advances our understanding of how the brain responds to emotionally charged objects and scenes.
The research, led by Trinity College Dublin neuroscientist Professor Sonia Bishop and Google researcher Samy Abdel-Ghaffar while he was a PhD student in Professor Bishop’s lab at the University of California, Berkeley, has identified how the brain represents different categories of emotional stimuli in a way that allows for more than a simple “approach-avoidance” dichotomy in guiding behavioural responses. The research was funded by the US National Institutes of Health.
Sonia Bishop, current Chair of the Department of Psychology in Trinity’s School of Psychology and senior author on the paper, explains: “It is critically important for all species to be able to recognise and respond appropriately to emotionally relevant stimuli, whether that is not eating rotten food, running away from a bear, approaching an attractive person in a bar or comforting a crying child.
“How the brain enables us to respond in nuanced ways to emotionally charged situations and stimuli has long been studied. However, little is known about how the brain stores schemas or neural representations to support the nuanced behavioural choices we make in response to naturally occurring emotional stimuli.
“Neuroscientific studies of motivated behaviour typically focus on simple approach or avoidance behaviours, such as pressing a lever to obtain food or changing location to avoid an electric shock. However, when confronted with natural emotional stimuli, humans do not simply choose between ‘approach’ and ‘avoid’; they instead select from a complex range of appropriate responses. Thus, for example, our ‘avoidance’ response to a large bear (leave the area as quickly as possible) is different from our ‘avoidance’ response to a weak, sick animal (do not get too close). Similarly, our ‘approach’ response to the positive stimulus of a potential mate differs from our ‘approach’ reaction to a cute baby.
“Our research reveals that the occipital temporal cortex is not only tuned to different categories of stimuli, but also breaks down these categories based on their emotional characteristics in a way that is well suited to guiding selection among alternative behaviours.”
The research team from Trinity College Dublin, the University of California Berkeley, the University of Texas at Austin, Google and the University of Nevada Reno analysed the brain activity of a small group of volunteers as they viewed more than 1,500 images depicting natural emotional scenes, such as a couple hugging, an injured person in a hospital bed, a luxurious house and an aggressive dog.
Participants were asked to classify the images as positive, negative or neutral and to rate the emotional intensity of the images. A second group of participants chose the behavioural responses that best matched each scene.
Using cutting-edge modelling of brain activity, broken down into tiny cubes less than 3 mm across, the study found that the occipital temporal cortex (OTC), a region at the back of the brain, is fine-tuned to represent both the type of stimulus (a single human, a couple, a crowd, a reptile, a mammal, food, an object, a building, a landscape, etc.) and the emotional characteristics of the stimulus: whether it is negative, positive or neutral, and whether it is of high or low emotional intensity.
Machine learning showed that these stable tuning patterns predicted the behaviours that the second group of participants matched to the images better than models trained directly on the image features, suggesting that the OTC efficiently extracts and represents the information needed to guide behaviour.
Google’s Samy Abdel-Ghaffar commented: “For this project, we used voxel-wise modelling, which combines machine learning methods, large datasets, and encoding models, to gain a much more precise understanding of what each part of the OTC represents than traditional neuroimaging methods provide. This approach allowed us to explore the intertwined representation of categorical and emotional scene features, and opened the door to a new understanding of how OTC representations predict behaviour.”
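For readers curious about what a voxel-wise encoding model looks like in practice, the following is a minimal, illustrative Python sketch on simulated data. The array sizes, feature names and ridge-regression setup are assumptions chosen for illustration; they do not reproduce the authors’ actual pipeline.

    # Illustrative sketch only: a toy voxel-wise encoding model on simulated data,
    # not the authors' pipeline. A regularised linear model maps image features
    # (e.g. category and valence/arousal descriptors) to each voxel's response.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_images, n_features, n_voxels = 1620, 50, 2000          # assumed, illustrative sizes

    X = rng.standard_normal((n_images, n_features))          # per-image feature vectors
    W = rng.standard_normal((n_features, n_voxels))          # simulated "true" tuning weights
    Y = X @ W + rng.standard_normal((n_images, n_voxels))    # simulated voxel responses

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

    encoder = Ridge(alpha=10.0)        # Ridge fits all voxels at once (multi-output)
    encoder.fit(X_tr, Y_tr)

    # Score each voxel by correlating its predicted and held-out responses.
    Y_hat = encoder.predict(X_te)
    voxel_r = [np.corrcoef(Y_te[:, v], Y_hat[:, v])[0, 1] for v in range(n_voxels)]
    print(f"median held-out prediction correlation: {np.median(voxel_r):.2f}")

In encoding models of this general kind, the pattern of fitted weights across features describes what each voxel is tuned to, and held-out prediction accuracy indicates how well that tuning captures the voxel’s responses.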
Professor Bishop added: “These findings extend our knowledge of how the human brain represents natural emotional stimuli. Furthermore, the paradigm used does not involve a complex task, making this approach suitable in the future, for example, to better understand how people with a variety of neurological and psychiatric conditions process natural emotional stimuli differently.”
More about the study method:
The team used a large new dataset of 1,620 natural emotional images and performed functional magnetic resonance imaging (fMRI) on adult human volunteers, acquiring over 3,800 3D volumes of brain activity while participants viewed the images. Participants rated the images for valence (positive, negative or neutral) and arousal (emotional intensity).
By modelling these data in tiny 2.4 × 2.4 × 3 mm chunks, or “voxels,” of brain activity, the researchers found that regions of the occipital temporal cortex at the back of the brain differentially represented both the semantic category and the affective value of the stimuli. For example, positive, high-arousal faces were represented in slightly different regions than negative, high-arousal faces and neutral, low-arousal faces.
Furthermore, when a completely new group of participants was asked to select behaviours consistent with each image, the higher dimensions of this neural coding representation “space” predicted the selected behaviours better than higher dimensions based directly on image features (for example, whether the stimulus is animate or positive). This suggests that the brain chooses which information is, and is not, important to represent, and maintains stable representations of subcategories of animate and inanimate stimuli that integrate affective information and are optimally organised to support the selection of behaviours towards different types of natural emotional stimuli.
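As a rough illustration of this type of comparison (a toy sketch on simulated data, with assumed names and sizes, rather than the study’s actual analysis), the same classifier can be trained to predict the chosen behaviour from either neural representation dimensions or raw image features, and its cross-validated accuracy compared:

    # Toy comparison on simulated data (illustrative assumptions throughout):
    # does a classifier predict chosen behaviours better from neural
    # representation dimensions or from raw image features?
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_images, n_behaviour_options = 1620, 6                   # assumed sizes

    neural_dims = rng.standard_normal((n_images, 20))         # e.g. components of OTC activity
    image_feats = rng.standard_normal((n_images, 100))        # e.g. raw image descriptors
    chosen = rng.integers(0, n_behaviour_options, n_images)   # behaviour matched to each image

    clf = LogisticRegression(max_iter=1000)
    acc_neural = cross_val_score(clf, neural_dims, chosen, cv=5).mean()
    acc_image = cross_val_score(clf, image_feats, chosen, cv=5).mean()

    # With purely random data both scores sit near chance (1/6); with real data,
    # the gap between them is what carries the scientific claim.
    print(f"accuracy from neural dimensions: {acc_neural:.2f}")
    print(f"accuracy from image features:   {acc_image:.2f}")

The design choice illustrated here is simply to hold the predictive model fixed and vary only the input representation, so that any difference in accuracy reflects how the information is organised rather than the power of the classifier.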