The study aims to integrate motion capture and wearable eye tracking to create a fully ecological setup that quantifies humans' ability to perceive others' gaze directed at different parts of their own body.
The ability to detect whether, and where on one's body, another person is gazing has high ecological value and can be decisive in many social situations. Small differences in perceived gaze direction can lead to dramatically different interpretations of another's intentions and beliefs, which in turn shape our own behavioral response.
Given its profound social relevance, surprisingly few systematic attempts have been made to characterize humans' ability to perceive others' gaze directed at the self and at different parts of one's body.
Previous studies have focused almost exclusively on the perception of others' gaze toward one's face, and have relied on computer-generated graphics of heads and eyes with, arguably, limited ecological validity.
To address this, we combine motion capture and wearable eye tracking in a fully ecological setup that allows us to quantify humans' ability to perceive others' gaze when it is directed toward different parts of the bodily self.
The experimental setup consists of a wearable eye tracker (Tobii Pro Glasses 3) synchronized with a motion capture system (Qualisys Miqus Hybrid, 8 cameras), allowing us to track a person's 3-D gaze vector and body position in real time.
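Once the eye tracker's 3-D gaze vector and the mocap markers are expressed in a common coordinate frame, the body part a gaze is directed at can be estimated geometrically. The sketch below is a minimal illustration of that idea, not the study's actual analysis pipeline: it assumes hypothetical marker labels and positions, and classifies the gazed-at body part as the marker lying closest to the gaze ray.

```python
import numpy as np

def point_ray_distance(point, origin, direction):
    """Perpendicular distance from a 3-D point to a ray (origin + t * direction, t >= 0)."""
    d = direction / np.linalg.norm(direction)
    v = point - origin
    # Clamp t so that points behind the gaze origin are measured to the origin itself.
    t = max(float(np.dot(v, d)), 0.0)
    closest = origin + t * d
    return float(np.linalg.norm(point - closest))

def gazed_body_part(markers, origin, direction):
    """Return the label of the body marker closest to the gaze ray.

    markers: dict mapping body-part label -> 3-D marker position (hypothetical labels).
    origin, direction: gaze origin and gaze direction in the shared mocap frame.
    """
    return min(markers, key=lambda label: point_ray_distance(markers[label], origin, direction))

# Illustrative example with made-up marker positions (meters):
markers = {
    "head": np.array([0.0, 0.0, 1.7]),
    "torso": np.array([0.0, 0.0, 1.2]),
    "hand": np.array([0.5, 0.0, 1.0]),
}
gaze_origin = np.array([2.0, 0.0, 1.6])
gaze_dir = markers["torso"] - gaze_origin  # an observer looking at the torso
print(gazed_body_part(markers, gaze_origin, gaze_dir))  # → torso
```

In practice, the body would be represented by a surface model rather than isolated markers, but the nearest-ray criterion conveys how a continuous gaze vector is reduced to a discrete body-part judgment.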
Collaboration
The project is conducted in collaboration with Martha Paskin, Ronja Löfström, Mikael Lundqvist and Arvid Guterstam (Karolinska Institutet).