General Discussion

1. Synthesis and Integration of the Main Findings

From the human recognition of emotions in conspecific and NHP vocalizations to the passive listening of affective calls in baboons, the present thesis investigated these phenomena in the auditory modality from an evolutionary perspective.

Study 1 crucially demonstrated, using fNIRS, distinct mechanisms for the explicit categorization and discrimination of emotions in the human voice and for their implicit processing. In accordance with previous fMRI findings showing a distinct involvement of the IFC in biased and unbiased choices (Dricu et al., 2017), and with the literature on the bilateral frontal processing of emotions (e.g. Davidson, 1992; Frühholz & Grandjean, 2013b; Grandjean, 2020; Schirmer & Kotz, 2006), we found modulations of the bilateral IFC depending on the categorization or discrimination of angry and fearful contents in implicit (word recognition task) and explicit (emotion identification task) auditory decoding (Figure 31). These results are supported by behavioural data (Figures 27 & 28): in the explicit task, participants were more accurate at categorizing than discriminating but took longer to select their answer, whereas in the implicit task they discriminated words better than they categorized them and answered faster.

Therefore, our findings suggest separate cerebral mechanisms for the vocal recognition of emotions, depending on the level of complexity (number of possible choices) of the perceptual decision-making. However, it remains unclear whether such processes are unique to the human voice or are also at play for heterospecific vocalizations.

In order to clarify this point, Study 2 revealed the existence of similar processes in the human recognition of affects in calls expressed by other primate species, namely chimpanzees, bonobos and rhesus macaques. fNIRS data indeed showed a modulation of the bilateral PFC and IFGtri depending on the categorization and discrimination of affects for all primate species (humans included) (Figure 38). Differences in affective decision-making, at least in auditory processing, thus seem to exist independently of the primate species that expressed the vocalizations. Interestingly, further analyses demonstrated that the correct categorization of agonistic chimpanzee and bonobo screams, as well as of affiliative chimpanzee calls, was associated with a decrease of activity in the bilateral PFC and IFGtri. On the contrary, the accurate discrimination of agonistic chimpanzee screams correlated with a bilateral enhancement of PFC and IFGtri activity (Figure 37). Although these last findings seem counterintuitive, several explanations could account for them. First, the categorization of agonistic vocalizations expressed by great apes, our closest relatives, could involve inhibition mechanisms to reduce the potentially induced high level of stress; frontal regions are indeed the brain areas most sensitive to stress exposure (Arnsten, 2009). Second, affiliative chimpanzee calls may be perceived as agonistic due to their loud and low-frequency characteristics (Kelly et al., 2017; Kret et al., 2018), acoustic parameters that are usually associated with aggressive behaviours (Briefer, 2012; Morton, 1977). Third, the enhancement of activity in frontal regions for agonistic chimpanzee vocalizations could reflect distinct mechanisms in the categorization and discrimination tasks during cross-taxa recognition: the simpler choice between A and non-A (compared to categorization) would not involve inhibition processes aimed at stress reduction.

Finally, the level of choice complexity also affected participants’ behaviour. With the exception of threatening bonobo calls, human participants were able to discriminate all affective cues in all primate species; in the categorization task, however, they were unable to do so for macaque vocalizations (Figure 36). It seems that the lower level of complexity involved in discrimination, compared to the more complex categorization mechanisms, allowed participants to discriminate affective vocalizations more accurately in all primates, including species at a greater phylogenetic distance from humans such as monkeys. Regarding the non-recognition of threatening cues in bonobo calls, participants may have been biased by the peaceful nature of the species (Gruber & Clay, 2016), which might be related to the higher F0 of their screams (Grawunder et al., 2018).

Overall, similar brain and behavioural mechanisms seem involved in perceptual affective decision-making for conspecific and heterospecific vocalizations. Moreover, our data suggest that acoustic, as well as phylogenetic, proximity could play a crucial role in cross-taxa recognition.

Assessing the roles of both phylogenetic and acoustic distances in the human recognition of affects in primate vocalizations, Study 3 importantly revealed strong acoustic similarities between affective chimpanzee calls and the human voice (Figure 42). Consequently, participants were more accurate at categorizing and discriminating affective cues in chimpanzee vocalizations than in bonobo or macaque screams (Figures 43 and 44). These results highlight the importance of acoustic features for vocal emotion recognition in heterospecific vocalizations. In fact, despite their phylogenetic proximity to Homo sapiens, bonobo calls were not recognized as well as chimpanzee calls by human participants. The peculiar evolutionary pathway of bonobos (Hare et al., 2012; Staes et al., 2018), leading to behavioural (Gruber & Clay, 2016) and vocal expression divergences (Grawunder et al., 2018), may prevent human participants from recognizing, for instance, threatening cues in their calls.

Furthermore, in line with Study 2, participants were able to discriminate distressful and affiliative macaque calls, while in the categorization task they were only capable of doing so for affiliative cues. These results emphasize, first, the differences between simple and complex choice mechanisms and, second, the distinction between forced-choice and Likert-scale measures of recognition performance. In fact, the ability of humans to accurately identify affects in macaque screams is controversial. Most comparative studies using arousal or valence ratings actually failed to demonstrate such capacities (Belin, Fecteau, et al., 2008; Fritz et al., 2018; Scheumann et al., 2014, 2017). However, research investigating the identification of affects using semantic labelling has shown the ability of human adults and infants to recognize various affective contents in macaque vocalizations (Linnankoski et al., 1994). Our findings suggest an intermediate position in which humans, despite overall poor performance, are nonetheless able to identify affective cues in macaque calls.

Finally, the relationships between participants’ accuracy and acoustic distances (Figure 45) revealed that acoustic feature similarities facilitated both the categorization and the discrimination of affiliative and distressful calls in all primate species. These findings highlight the key role of acoustic proximity in cross-taxa recognition. On the contrary, higher acoustic distances were associated with better performance for threatening human and chimpanzee vocalizations. This last result may seem counterintuitive a priori; however, the literature has shown that human infant and chimpanzee vocalizations with agonistic contents lead to similar brain responses in human participants within a novelty oddball paradigm (Scheumann et al., 2017). If threatening contents in human and chimpanzee calls can trigger novelty responses leading to attentional capture, it is possible that, due to their higher acoustic distances, our stimuli were more acoustically salient and thus yielded better accuracy.

Our results suggest that both phylogenetic and acoustic proximities are essential for the correct categorization and discrimination of affects in primate calls. But do the same mechanisms exist in brain regions often linked exclusively to conspecific vocalizations?

Study 4 addressed this question by focusing on the TVA, usually associated with the human listening of the human voice.

Importantly, however, fMRI whole-brain analyses revealed an increase of activity in the left aSTG for the human categorization of chimpanzee screams, as well as an increase of activity in the bilateral STC for the recognition of human and chimpanzee vocalizations compared to bonobo and macaque calls (Figure 52). Following this, functional connectivity analyses demonstrated a similar coupling between the left mMTG and right mSTS for the perception of human and chimpanzee vocalizations (Figure 53). With the exception of bonobo calls, for which no previous brain data exist in the literature, these results are in line with recent fMRI findings showing a gradient of activity in the bilateral STS for the human perception of primate vocalizations, with the strongest neural responses for human voices, weaker responses for chimpanzee calls, and the lowest activations for macaque calls (Fritz et al., 2018). In addition, in line with Studies 2 and 3, our findings highlight the influence of the evolutionary and acoustic divergences of the bonobo species on the recognition of their calls by human participants.

Interestingly, further analyses investigated the link between reaction times and the acoustic distances of the vocalizations, and showed that human participants were faster at recognizing macaque calls when their acoustic features were most dissimilar to the human voice (Figure 51). This last finding could be explained by an oddball effect induced by the human recognition of macaque vocalizations with infrequent acoustic features. In fact, previous EEG findings have revealed a P3a ERP for the human perception of macaque screams, emphasizing the involuntary attention switch from familiar to novel stimuli (Scheumann et al., 2017).

Hence, the results of Study 4 confirm that both phylogenetic and acoustic proximity are needed to recruit the TVA in the human brain. From this assessment, analogous mechanisms should be at play in the IFG, a region of interest for perceptual decision-making.

Using the same paradigm as Study 4, Study 5 revealed increased activity of the pars triangularis, opercularis and orbitalis of the bilateral IFG for the categorization of chimpanzee calls, independently of whether the calls were correctly recognized (Figures 57 and 59). Moreover, whole-brain analyses showed the involvement of the bilateral IFG in the processing of bonobo and macaque vocalizations (Figures 58 and 59). Yet, bonobo calls did […] activation in the left IFG for the human discrimination of emotions in human voices compared to chimpanzee vocalizations and then compared to macaque calls. Interestingly, our analyses demonstrated the involvement of most subparts of the IFG (left IFGorb excepted) in the correct categorization of macaque vocalizations. On the contrary, activations in the bilateral IFGtri only were found for the correct recognition of bonobo screams. The subparts of the IFG thus seem differently implicated in the processing of heterospecific vocalizations. In agreement with the fNIRS data shown in Study 2, the IFGtri indeed appears particularly involved in such processes. Finally, consistent with the behavioural results of Studies 2, 3 and 4, participants were able to identify all NHP vocalizations with the exception of bonobo calls (Figure 61), again emphasizing the possible evolutionary divergence (Hare et al., 2012; Staes et al., 2018) as well as the acoustic and behavioural differences (Grawunder et al., 2018; Gruber & Clay, 2016) of this peculiar great ape species.

Therefore, human participants seem capable of recognizing affects in the calls of most primate species; however, phylogenetic, acoustic and behavioural proximity seems required to enhance activity in all subparts of the bilateral IFG.

Overall, Studies 1 to 5 investigated perceptual decision-making in humans using primate vocalizations (human, chimpanzee, bonobo and macaque). Our findings suggest that distinct behavioural and cerebral mechanisms, depending on the level of complexity, are involved in categorization and discrimination tasks. Interestingly, such behavioural and brain processes were also at play in the affective recognition of NHP vocalizations.

Finally, Study 1 and Study 2 demonstrated the suitability of fNIRS for exploring cerebral processing in healthy humans with complex paradigms. Yet, it remained unclear whether such a relatively new device could be used to assess brain mechanisms in NHPs non-invasively. To this end, Study 6 used a wireless, portable fNIRS device to investigate hemispheric lateralization in baboons as a proof of concept. Here, hemodynamic response analyses revealed contralateral activations for motor stimulations, with right-arm movements activating the left hemisphere more and left-arm movements enhancing activity in the right hemisphere (Figure 64). These results are in agreement with previous studies in primates showing that 90% of the corticospinal pathway projects to the contralateral spinal cord (Brösamle & Schwab, 1997; Dum & Strick, 1996; Heming et al., 2019; Lacroix et al., 2004). For the auditory stimulations, significant responses were found for the baboon Chet only, with, additionally, a right-hemisphere bias for the sounds broadcast in stereo (Figure 65). The lack of results in the two other baboons could be explained by their handedness in communicative gestures. In fact, our left-handed subject, Talma, showed a clear right-hemisphere bias for most of the stimuli, whereas right-handed Rubis showed a stronger bias toward the left hemisphere for the sounds broadcast to the right and left ears. Yet, the findings for stereo stimuli are consistent with the literature on the human and NHP auditory pathways showing a rightward asymmetry for stereo sounds (Jäncke et al., 2002; Joly et al., 2012; Kaiser et al., 2000; Petkov et al., 2008).

Overall, this last study supports the suitability of fNIRS for assessing brain mechanisms in NHPs, and opens new avenues of research in free-ranging primates.

Findings from Studies 1 to 6 will hopefully contribute to a better understanding of i) the human recognition of affects in heterospecific vocalizations; ii) the evolutionary continuity between human and NHP brains; and iii) the development of non-invasive protocols in animal research.