
HAL Id: hal-03268874

https://hal.archives-ouvertes.fr/hal-03268874

Submitted on 23 Jun 2021


The influence of emotion and empathy on gaze patterns when exploring controlled static and ecological dynamic faces

Antoine Coutrot, Astrid Kibleur, Marion Trousselard, Barbara Lefranc, Céline Ramdani, Karolina Stepien, Déborah Varoqui, Jonas Chatel Goldman

To cite this version:

Antoine Coutrot, Astrid Kibleur, Marion Trousselard, Barbara Lefranc, Céline Ramdani, et al. The influence of emotion and empathy on gaze patterns when exploring controlled static and ecological dynamic faces. Vision Sciences Society (VSS), May 2021, St Pete, Florida, United States. Vision Sciences Society Annual Meeting Abstract. ⟨hal-03268874⟩


The influence of emotion and empathy on gaze patterns when exploring controlled static and ecological dynamic faces

A. Coutrot a,b,*, A. Kibleur c, M. Trousselard d, B. Lefranc d, C. Ramdani d, K. Stepien b, D. Varoqui c & J. Chatel Goldman c

a LIRIS, CNRS, Université de Lyon, France
b LS2N, CNRS, Université de Nantes, France
c Open Mind Innovation, France
d Institut de Recherche Biomédicale des Armées, France

* corresponding author: antoine.coutrot@liris.cnrs.fr

Introduction

The influence of facial emotions on gaze patterns when exploring faces is still debated.

Previous research reported that the relative proportion of fixations on the different face areas either is (1,2) or is not (3,4) modulated by the expressed emotion. While most previous studies used static face images or simulated dynamic facial expressions (3), we test how these findings generalize to more ecological, spontaneous dynamic expressions of emotion.

Figure 1 - Eye movements in static stimuli | a- Average static face with the Regions of Interest (ROIs): left eye, right eye, mouth, nasion, nose, rest of the face. b- Fixation rate in each ROI across time, averaged across all static stimuli. c- Fixation rate in each ROI for each emotion, averaged across time within the analysis window; the horizontal dashed lines represent the fixation rate for the Neutral condition. Fixation rates have been averaged within 40 ms time windows (to simplify the plot, the curve markers do not correspond to the sampling rate). Error bars represent standard errors.
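The fixation-rate curves in Figure 1b can be reproduced from raw fixation data with a simple binning step. The sketch below only illustrates that computation: it assumes a pandas DataFrame with hypothetical 'time' and 'roi' columns and defines the rate as the proportion of fixations falling in each ROI per 40 ms bin. It is not the analysis code used for the poster.

```python
import numpy as np
import pandas as pd

def fixation_rate_over_time(fixations: pd.DataFrame,
                            rois=("left_eye", "right_eye", "mouth",
                                  "nasion", "nose", "rest_of_face"),
                            bin_size=0.04, t_max=2.4):
    """Proportion of fixations landing in each ROI within 40 ms time bins.

    `fixations` is assumed to have columns 'time' (s, relative to stimulus
    onset) and 'roi' (one label per fixation). Returns a DataFrame indexed
    by bin start time, with one column per ROI.
    """
    bins = np.arange(0.0, t_max + bin_size, bin_size)
    fixations = fixations.copy()
    fixations["bin"] = pd.cut(fixations["time"], bins, right=False,
                              labels=bins[:-1])
    # Count fixations per (bin, roi), then normalise by the total number of
    # fixations in each bin so the ROI curves sum to 1 at every time step.
    counts = (fixations.groupby(["bin", "roi"], observed=False)
                        .size().unstack(fill_value=0)
                        .reindex(columns=list(rois), fill_value=0))
    return counts.div(counts.sum(axis=1).replace(0, np.nan), axis=0)
```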


Methods

We recorded the eye movements of 170 participants while they categorized the valence of static and dynamic emotional faces. Static emotions were performed by actors from the classic Karolinska Directed Emotional Faces database (5), while dynamic emotions were genuine, natural facial expressions from ordinary people, filmed in natural but standardized conditions (DynEmo database (6)). Participants also completed the Questionnaire of Cognitive and Affective Empathy (7) to assess their empathy profile; we clustered them into four empathy profiles: Mature (N=55, 15 males), Affective (N=45, 25 males), Cognitive (N=44, 30 males), and Low (N=22, 15 males).
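The poster does not detail how the four empathy profiles were derived from the QCAE; one plausible approach is to cluster participants on their cognitive and affective sub-scores, e.g. with k-means (k = 4). The sketch below illustrates that idea only; the input layout and the use of scikit-learn are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def empathy_profiles(qcae_scores: np.ndarray, n_profiles=4, seed=0):
    """Cluster participants into empathy profiles from QCAE sub-scores.

    `qcae_scores` is assumed to be an (n_participants, 2) array holding the
    cognitive and affective empathy sub-scores. Returns one integer label per
    participant plus the cluster centres; labels still need to be mapped to
    interpretable names (e.g. Mature, Affective, Cognitive, Low) by
    inspecting the cluster means.
    """
    z = StandardScaler().fit_transform(qcae_scores)  # put both sub-scores on the same scale
    km = KMeans(n_clusters=n_profiles, n_init=10, random_state=seed).fit(z)
    return km.labels_, km.cluster_centers_
```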

Figure 2 - Eye movements in dynamic stimuli | a- Four illustrative frames of a ‘happy’ dynamic face with the Regions of Interest (ROIs). b- Fixation rate in each ROI across time, averaged across all dynamic stimuli; stimuli have been aligned on the beginning of the video. c- Fixation rate in each ROI across time, averaged within each emotion (Surprise, Disgust, Happy, Neutral); stimuli have been aligned on the beginning of the emotion (neutral onsets have been randomly sampled within the video). Fixation rates have been averaged within 40 ms time windows (to simplify the plot, the curve markers do not correspond to the sampling rate). Error bars represent standard errors.
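Figure 2c requires re-aligning each gaze time series to the onset of the expressed emotion rather than to the start of the video, with pseudo-onsets drawn at random for neutral videos. A minimal sketch of that alignment step is shown below; the data layout and the window bounds are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def align_to_emotion_onset(time_s: np.ndarray, emotion_onset: float,
                           window=(0.0, 0.8), neutral=False,
                           video_duration=None):
    """Shift fixation timestamps so that t = 0 corresponds to emotion onset.

    For neutral videos, which have no emotion onset, a pseudo-onset is drawn
    uniformly within the video (as described in the Figure 2 caption).
    Returns a boolean mask selecting fixations inside the analysis window
    and the re-aligned timestamps.
    """
    if neutral:
        emotion_onset = rng.uniform(0.0, video_duration - window[1])
    t = time_s - emotion_onset
    mask = (t >= window[0]) & (t < window[1])
    return mask, t
```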

[Results panels: Effect of emotion in static faces; Effect of emotion in dynamic faces; Effect of empathy in static faces; Effect of empathy in dynamic faces.]

Figure 3 - Effect of empathy on eye movements | For each empathy profile (Mature, Affective, Cognitive, Low), fixation rate per ROI (left eye, right eye, mouth) averaged across time within the analysis window, for static and dynamic faces. The horizontal dashed lines represent the fixation rate for the Mature profile.

Results

We found strong similarities between the gaze patterns in the static (Fig 1) and dynamic (Fig 2) conditions.

We used linear mixed models with the fixation rate in each ROI as response; gender, emotion, and empathy profile as fixed effects; and participant ID as a random effect. We found a main effect of emotion on fixation rate in all facial regions of interest (left and right eye, nasion, nose, mouth, rest of the face).

In both static and dynamic stimuli, participants in the Mature empathy group gazed more at the left eye than participants in the Low empathy group (static stimuli: Hedges' g = 0.50, 95% CI = [0.43, 0.56]; dynamic stimuli: Hedges' g = 0.34, 95% CI = [0.22, 0.46]).
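As a rough sketch of the statistical approach described above, the snippet below fits a linear mixed model with statsmodels and computes Hedges' g with a bootstrap 95% CI (the poster does not state how the CIs were obtained). The column names ('fix_rate', 'emotion', 'gender', 'profile', 'participant') are assumptions for illustration, not the authors' analysis script.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_fixation_model(df: pd.DataFrame):
    """Fixation rate ~ emotion + gender + empathy profile, random intercept per participant."""
    model = smf.mixedlm("fix_rate ~ C(emotion) + C(gender) + C(profile)",
                        data=df, groups=df["participant"])
    return model.fit()

def hedges_g(x, y, n_boot=2000, seed=0):
    """Hedges' g (bias-corrected standardized mean difference) with a bootstrap 95% CI."""
    x, y = np.asarray(x, float), np.asarray(y, float)

    def g(a, b):
        nx, ny = len(a), len(b)
        # Pooled standard deviation, then Cohen's d with small-sample correction.
        sp = np.sqrt(((nx - 1) * a.var(ddof=1) + (ny - 1) * b.var(ddof=1)) / (nx + ny - 2))
        d = (a.mean() - b.mean()) / sp
        return d * (1 - 3 / (4 * (nx + ny) - 9))

    rng = np.random.default_rng(seed)
    boot = [g(rng.choice(x, len(x)), rng.choice(y, len(y))) for _ in range(n_boot)]
    return g(x, y), np.percentile(boot, [2.5, 97.5])
```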

Conclusions

Our results suggest that moderate differences in gaze behavior, such as those associated with the observer's empathy profile, can generalize from a classic, well-controlled static dataset to a more ecological and dynamic dataset.

Furthermore, we did not find any effect of gender on fixation rates. This suggests that the previously reported stronger left-eye bias in females [8,9] may well be due to women being, on average, more empathetic than men.

References

[1] Eisenbarth & Alpers. Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion, 2011.
[2] Schurgin et al. Eye movements during emotion recognition in faces. Journal of Vision, 2014.
[3] Blais et al. Eye fixation patterns for categorizing static and dynamic facial expressions. Emotion, 2017.
[4] de Boer et al. Eyes on emotion: Dynamic gaze allocation during emotion perception from speech-like stimuli. Multisensory Research, 2020.
[5] Lundqvist et al. The Karolinska Directed Emotional Faces (KDEF). Karolinska Institutet, 1998.
[6] Tcherkassof et al. DynEmo: A video database of natural facial expressions of emotions. The International Journal of Multimedia, 2013.
[7] Reniers et al. The QCAE: A Questionnaire of Cognitive and Affective Empathy. Journal of Personality Assessment, 2011.
[8] Coutrot et al. Face exploration dynamics differentiate men and women. Journal of Vision, 2016.
[9] Sokhn et al. A left eye bias for female faces. IEEE Knowledge and Smart Technology.
