A person’s facial expression provides crucial information for recognizing their emotions. But there is much more to this process than that, according to research carried out by Dr. Leda Berio and Professor Albert Newen of the Institute of Philosophy II at Ruhr University Bochum, Germany. The team describes emotion recognition not as a separate module, but as part of a comprehensive process that helps us form an overall impression of another person. This process of forming impressions of people also draws on physical and cultural characteristics, as well as background information. The article was published on September 24, 2024, in the journal Philosophy and Phenomenological Research.
Understanding the situation affects the way we recognize emotions
In the 1970s, the theory was proposed that the face is the window to our emotions. Researcher Paul Ekman described basic emotions such as fear, anger, disgust, joy, and sadness in terms of typical facial expressions, which turned out to be similar across cultures. “However, in recent years it has become increasingly apparent that there are many situations in life in which a typical facial expression is not necessarily the key information that guides our assessment of other people’s feelings,” notes Newen, citing the following example: “People almost universally rate a typical fearful facial expression as anger when they have prior knowledge that a waiter turned away the person being evaluated even though he had demonstrably reserved a table.” In such a situation, people expect the person to be angry, and this expectation determines how they perceive the emotion, even if the facial expression would normally be attributed to a different one.
“In addition, sometimes we can recognize emotions even without seeing the face; for example, the fear felt by a person who is attacked by a growling dog, even if we only see them from behind, frozen in a fearful posture,” illustrates Berio.
Recognizing an emotion is part of our overall impression of a person
Berio and Newen propose that recognizing emotions is a subprocess of our ability to form an overall impression of a person. In doing so, people are guided by certain characteristics of the other person: physical features such as skin color, age, and sex; cultural characteristics such as clothing and attractiveness; and situational characteristics such as facial expression, gestures, and posture.
Based on such characteristics, people tend to quickly evaluate others and immediately associate social status and even certain personality traits with them. These associations dictate how we perceive other people’s emotions. “If we perceive a person as a woman and they show a negative emotion, we are more likely to attribute the emotion to fear, while in a man it is more likely to be interpreted as anger,” says Berio.
Background information is included in the evaluation
In addition to the perception of initial characteristics and associations, we also hold detailed personal images that serve as background information for people in our social circle: family, friends, and colleagues. “If a family member has Parkinson’s, we learn to read that person’s typical facial expression, which seems to indicate anger, as neutral, because we are aware that a rigid facial expression is part of the disease,” says Berio.
The background information also includes models of people from typical professional groups. “We have stereotypical assumptions about the social roles and responsibilities of, for example, doctors, students, and workers,” says Newen. “In general, we perceive doctors as less emotional, for example, which changes the way we evaluate their emotions.”
In other words, people draw on a wealth of characteristics and prior knowledge to evaluate another person’s emotions. Only in rare cases do they read emotion solely from a person’s facial expression. “All of this has implications for emotion recognition using artificial intelligence (AI): it will only be a reliable option when AI does not rely solely on facial expressions, which is what most systems currently do,” says Newen.