Facial recognition is a fundamental part of self-image and social interactions. In an age of advanced digital technology, we face intriguing questions about communication and identity. How does altering our facial identity affect our sense of self and our interactions with others? These are questions that Dr. Shunichi Kasahara, a researcher at the Cybernetic Humanity Studio at the Okinawa Institute of Science and Technology (OIST), is investigating by morphing facial images in real time (turning our faces into someone else’s and vice versa). The studio was established in 2023 as a platform for joint research between OIST and Sony Computer Science Laboratories, Inc.
Dr. Kasahara and his collaborators have investigated the dynamics of face recognition using visual-motor synchrony (the coordination between a person’s physical movements and the visual feedback they receive from those movements). They found that whether or not we control the movement of our self-image, our level of identification with our own face remains constant. In other words, our sense of agency, or subjective feeling of control, does not affect how strongly we identify with our self-image. Their results have been published in Scientific Reports.
The agency effect on identity perceptions
In psychological experiments using screens and cameras, the scientists investigated where the “self-identification boundary” lies and what influences it. Participants sat and watched screens on which their faces were shown gradually changing. At some point, participants would notice a change in their facial identity, and they were asked to press a button as soon as they felt that the image on the screen was no longer their own. The experiment was conducted in both directions: the image changed from self to other and from other to self.
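The paper does not describe the studio’s actual morphing pipeline, but the gradual self-to-other change can be pictured as interpolation between two aligned face images. Below is a minimal, hypothetical sketch using a simple linear cross-dissolve (the function name `morph` and the toy arrays are illustrative, not from the study):

```python
import numpy as np

def morph(self_face: np.ndarray, other_face: np.ndarray, alpha: float) -> np.ndarray:
    """Linearly blend two aligned face images.

    alpha = 0.0 returns the participant's own face;
    alpha = 1.0 returns the other person's face.
    """
    if self_face.shape != other_face.shape:
        raise ValueError("face images must be aligned to the same shape")
    return (1.0 - alpha) * self_face + alpha * other_face

# Stand-in images: a real experiment would use aligned photographs.
self_face = np.zeros((64, 64))   # placeholder for the participant's image
other_face = np.ones((64, 64))   # placeholder for the other person's image

# Gradually morph from self to other, as in the self-to-other condition;
# reversing the list gives the other-to-self direction.
frames = [morph(self_face, other_face, a) for a in np.linspace(0.0, 1.0, 11)]
```

In an experiment like the one described, the participant would watch such frames play out and press a button at the alpha value where the face stops feeling like their own.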
“It’s like looking at your face in a mirror while you move it and identify yourself, but your face slowly changes to a point where you realize it’s not you anymore,” Dr. Kasahara explained.
The researchers examined how three motion conditions affect the facial identity boundary: synchronous, asynchronous, and static. They hypothesized that if the movements were synchronized, participants would identify with the images to a greater extent. Surprisingly, they found that participants’ identity boundaries were similar regardless of whether the movements were synchronized or not. Furthermore, participants were more likely to identify with static images of themselves than with images of their faces in motion.
Interestingly, the direction of the morphing (either from self to other or from other to self) influenced how participants perceived their own facial boundaries: participants were more likely to identify with their facial images when these images morphed from self to other rather than from other to self. Overall, the results suggest that the sense of agency over facial movements does not significantly affect our ability to judge our facial identity.
“Let’s consider the example of deepfakes, which are essentially a form of asynchronous movement. When I remain still but the visual representation moves, an asynchronous situation is created. Even in these deepfake scenarios, we can experience a sense of identity connection with ourselves,” Dr. Kasahara explained. “This suggests that even when we see a fake or manipulated version of our image – for example, another person using our face – we can still identify with that face. Our findings raise important questions about our perception of ourselves and identity in the digital age.”
How does identity affect perceptions of control?
And what about the other way around? How does our sense of identity affect our sense of agency? Dr. Kasahara recently published a paper in collaboration with Rikkyo University psychology professor Dr. Wen Wen, who specializes in research on the sense of agency. They investigated how recognizing oneself through facial features might affect how people perceive control over their own movements.
During the experiments, participants watched their own or another person’s face on a screen and were able to interact with and control facial and head movements. They were asked to watch the screen for about 20 seconds while moving their faces and changing their facial expressions. Face movement was controlled by either their own face and head movement alone or by an average of the participant’s and the experimenter’s movement (full control vs. partial control). They were then asked “how much did you feel this face was like you?” and “how much control did you feel over this presented face?”
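The full-control versus partial-control manipulation described above amounts to weighting how much of the displayed face’s motion comes from the participant versus the experimenter. As a rough sketch (the function name `displayed_motion` and the 1-D traces are illustrative assumptions, not the study’s implementation):

```python
import numpy as np

def displayed_motion(participant: np.ndarray,
                     experimenter: np.ndarray,
                     control: float = 1.0) -> np.ndarray:
    """Blend two motion signals into the motion shown on screen.

    control = 1.0 -> full control: only the participant drives the face.
    control = 0.5 -> partial control: an even average of both movements,
                     as in the averaged condition described in the article.
    """
    return control * participant + (1.0 - control) * experimenter

# Toy 1-D head-rotation traces sampled over time (degrees per frame).
participant = np.array([0.0, 1.0, 2.0, 3.0])
experimenter = np.array([0.0, -1.0, -2.0, -3.0])

full = displayed_motion(participant, experimenter, control=1.0)
partial = displayed_motion(participant, experimenter, control=0.5)
```

With `control=1.0` the on-screen face tracks the participant exactly; with `control=0.5` the opposing experimenter trace dilutes the participant’s influence, which is what weakens the felt sense of control in the partial-control condition.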
Once again, the main findings were intriguing: participants reported a greater sense of control over the “other face” than over their own face. Furthermore, controlling another person’s face resulted in a greater variety of facial movements than controlling one’s own face.
“We gave participants a different face, but they could control the facial movements of that face, similar to deepfake technology, where AI can transfer motion to other objects. This AI technology allows us to go beyond the conventional experience of simply looking in a mirror, allowing us to unravel and investigate the relationship between facial movements and visual identity,” said Dr. Kasahara.
“Based on previous research, one might expect that if I see my own face, I would feel more control over it. Conversely, if it’s not my face, I might expect to feel less control because it’s someone else’s face. That’s the intuitive expectation. However, the results are the opposite: When people see their own face, they report a lower sense of agency. In contrast, when they see someone else’s face, they are more likely to feel a sense of agency.” These surprising results challenge what we thought we knew about how we perceive ourselves in images.
Dr. Kasahara stressed that the acceptance of technology in society plays a crucial role in technological advancements and human evolution: “The relationship between technology and human evolution is cyclical; we evolve together. But concerns about certain computer technologies can lead to restrictions. My goal is to help foster acceptance within society and update our understanding of the ‘self’ in relation to human-computer integration technology.”