Is it possible to decipher how we feel from our movements? How can emotions be studied “from the outside” using empirical methods? To answer these questions, a large international and interdisciplinary research team led by the Max Planck Institute for Empirical Aesthetics (MPIEA) in Frankfurt am Main, Germany, has developed an integrative scientific methodology. Using artistic and digital media such as motion capture technology, the researchers developed the EMOKINE software to measure the objective kinematic characteristics of movements that express emotions. The results of the study have recently been published in the journal Behavior Research Methods.
The team had a professional dancer repeat short dance choreographies in front of a green screen. She was asked to express different emotions through her movements: anger, satisfaction, fear, happiness, neutrality, and sadness. To capture the dance movements as “data”, the scientists dipped into the MPIEA’s technological data bank: the dancer wore a full-body motion capture suit from XSENS®, equipped with a total of 17 highly sensitive sensors. In combination with a film camera, the suit measured and recorded the dancer’s dynamic body movements. The researchers then extracted objective kinematic features (motion parameters) and programmed the EMOKINE software, which provides these motion parameters from data sets at the push of a button.
Computerized tracking of whole-body motion
A total of 32 statistics were collected for 12 motion parameters extracted from a pilot dance dataset. The recorded kinematic parameters included, for example, velocity, acceleration, and limb contraction.
“We identified 12 kinematic features of whole-body emotional movements that have been analyzed separately in the previous research literature. We then extracted them all from a single dataset and subsequently fed them into the EMOKINE software,” reports first author Julia F. Christensen from MPIEA.
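Kinematic features of this kind can be computed directly from tracked joint positions. The following is a minimal sketch in Python, assuming joint positions sampled at a fixed frame rate; the function names, the toy data, and the particular operationalization of limb contraction (hand-to-head distance) are illustrative assumptions, not EMOKINE’s actual implementation or API.

```python
import numpy as np

def velocity(positions, fps):
    # Speed of one joint per frame, from consecutive 3-D positions (frames x 3).
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps

def acceleration(positions, fps):
    # Frame-to-frame change in speed.
    return np.diff(velocity(positions, fps)) * fps

def limb_contraction(hand, head):
    # One common operationalization: hand-to-head distance per frame;
    # smaller values indicate a more contracted posture.
    return np.linalg.norm(hand - head, axis=1)

# Toy data: 100 frames of a synthetic wrist trajectory and a static head, at 60 fps.
rng = np.random.default_rng(0)
wrist = np.cumsum(rng.normal(scale=0.01, size=(100, 3)), axis=0)
head = np.tile([0.0, 0.0, 1.7], (100, 1))

# Summary statistics of the kind a per-trial feature set could aggregate.
summary = {
    "mean_velocity": float(velocity(wrist, 60).mean()),
    "max_abs_acceleration": float(np.abs(acceleration(wrist, 60)).max()),
    "mean_contraction": float(limb_contraction(wrist, head).mean()),
}
print(summary)
```

In practice, such statistics (means, maxima, and so on) would be computed per joint and per trial, yielding a feature table that can then be related to the intended emotion of each movement sequence.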
Motion tracking has been used in many fields in recent years because the objective recording of motion parameters can provide information about people’s intentions, feelings, and mood. However, this research requires a theory-based methodology in order to draw meaningful conclusions from the recorded data.
“This work shows how artistic practice, psychology and computer science can work together in an ideal way to develop methods for studying human cognition,” says co-lead author Andrés Fernández of the Max Planck Institute for Intelligent Systems in Tübingen, Germany.
The methodological framework that accompanies the software package explicitly uses dance movements to study emotions, departing from previous research approaches, which often relied on video clips of “emotional actions” such as hand waving or walking.
“We are particularly excited about the publication of this work, which involved so many experts, for example from Goethe University Frankfurt am Main, the University of Glasgow, and a film team from WiseWorld Ai, Portugal. It has brought together the disciplines of psychology, neuroscience, computer science, and empirical aesthetics, but also dance and film,” sums up senior author Gemma Roig, Professor of Computer Science, Computer Vision and Artificial Intelligence Laboratory, at Goethe University.
The open source software package
EMOKINE is freely available on ZENODO and GitHub and can be adapted to other motion capture systems with minor modifications. These open-access digital tools can be used to analyze the emotional expression of dancers and other groups of performers, as well as everyday movements.
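Since the release only states that adaptation to other capture systems requires “minor modifications”, the following is a purely hypothetical sketch of what such an adapter could look like: reading per-frame joint coordinates exported by a different system and mapping its joint labels onto a common naming scheme before feature extraction. The column names, joint map, and function are illustrative assumptions, not EMOKINE’s actual interface.

```python
import pandas as pd

# Hypothetical mapping from another system's joint labels to a common naming scheme.
JOINT_MAP = {"RWristOut": "right_wrist", "HeadTop": "head", "Hips": "pelvis"}

def load_mocap_csv(path, fps=60):
    """Read a per-frame CSV with columns like 'RWristOut_x', 'RWristOut_y', ...
    and return a dict mapping common joint names to (frames x 3) position arrays."""
    df = pd.read_csv(path)
    joints = {}
    for src, dst in JOINT_MAP.items():
        cols = [f"{src}_{axis}" for axis in ("x", "y", "z")]
        joints[dst] = df[cols].to_numpy(dtype=float)
    return joints, fps
```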
The researchers hope that the EMOKINE software they have developed will be used in experimental psychology, affective neuroscience, and computer vision, a branch of AI that enables computers and systems to extract meaningful information from digital images, videos, and other visual data, especially in AI-assisted analysis of visual media. EMOKINE will help scientists answer research questions about how kinematic parameters of whole-body movements convey different intentions, feelings, and moods to the observer.