
6.2 Study 2b

6.2.1.3 Research questions and hypotheses

Various areas of research have linked cognitive processes to emotions (D’Mello and Graesser, 2012; Eastwood et al., 2001; Lerner et al., 2015; Spering et al., 2005). For example, D’Mello and Graesser (2012) explain how emotions and cognitive processing of information are intertwined in individual complex learning. In their model, specific emotional (or cognitivo-affective) states go hand in hand with specific cognitive states (flow with equilibrium, confusion with disequilibrium, frustration with being stuck, disengagement with boredom). As cognitive reasoning is also conveyed through communicative exchanges in collaboration, a question that can be asked is whether similar findings hold between emotion sharing and collaborative acts. Furthermore, as outlined in the previous section, emotional sharing is not only related to socio-epistemic matters. The literature also shows that socio-emotional matters are a significant concern in social interaction (Rimé, 2007).

In addition, Pekrun and Linnenbrink-Garcia (2012) distinguish several types of emotions occurring in individual and collective academic settings, related to specific focuses of learning, such as achievement emotions (related to the achievement of activities and outcomes; e.g., the frustration of not succeeding), epistemic emotions (related to the learner’s cognitive processing of information; e.g., the confusion of not understanding a problem) or social emotions (related to the relationship with others; e.g., gratitude towards a peer). This categorization suggests that emotional sharing could also be related to different collaborative focuses, such as socio-cognitive and socio-relational ones, that could be shared preferentially in different phases of collaboration (e.g., when problem-solvers make acquaintance or try to find new ideas to solve the problem).

We proposed above that the emitter would use emotions to draw the receiver’s attention to important emotional, motivational or cognitive matters, with the intention of inducing perlocutionary effects in the receiver. Consequently, emotional expressions would lead the receiver to make inferences about the emitter’s needs, which would induce adaptive effects in return. These adaptive changes are posited to occur in real time, before emotional sharing for the emitter and after emotional sharing for the receiver. Therefore, the first question that drives this study is whether emotion sharing modulates collaborative acts (RQ1). First, we assume that real-time changes in specific collaborative acts by the emitter precede specific emotional sharing by the emitter (H1a). From an operational point of view, specific collaborative acts should show a significant increase (resp. decrease) preceding specific emotional sharing, compared to when no emotion is shared. For example, if the emitter has a strong positive opinion about a possible solution to the task, he/she should be more likely to draw the receiver’s attention by sharing an emotion of interest. Second, we assume that real-time changes in some specific collaborative acts by the receiver follow specific emotional sharing by the emitter (H1b). For example, if the emitter shares an emotion of interest, the receiver should adapt his/her collaborative acts accordingly.
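The operational reading of H1a can be sketched as a rate comparison: the frequency of a collaborative act in a fixed window preceding each emotion-sharing event versus its baseline rate over the whole session. The following Python sketch uses hypothetical event timestamps and window size; the actual acts, events, and statistical test come from the study’s coded transcripts, not from this toy example.

```python
# Hypothetical operationalization sketch of H1a: does a given collaborative
# act occur more often in the window just before emotion-sharing events than
# it does on average? All timestamps below are illustrative toy data.

def rate_before_events(act_times, event_times, window):
    """Rate (acts per second) of a collaborative act inside the `window`
    seconds preceding each emotion-sharing event."""
    hits = sum(
        1
        for a in act_times
        for e in event_times
        if e - window <= a < e
    )
    total_window = window * len(event_times)
    return hits / total_window if total_window else 0.0

def baseline_rate(act_times, session_length):
    """Overall rate of the act across the whole session."""
    return len(act_times) / session_length

# Toy data: acts cluster just before emotion-sharing events at t=60 and t=120.
acts = [55, 57, 58, 90, 115, 118, 119]
events = [60, 120]

before = rate_before_events(acts, events, window=10)   # acts/s in 10 s windows
overall = baseline_rate(acts, session_length=180)
print(before > overall)  # True: an H1a-consistent pattern in this toy example
```

H1b would mirror this sketch with windows placed *after* each event and acts attributed to the receiver rather than the emitter.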

A second issue that arises from the literature concerns the relationships between the sharing of certain emotions and certain patterns of collaboration. Therefore, the second question is whether specific patterns of collaboration can be highlighted when considering the triad formed by the emitter’s collaborative acts, emotional sharing, and the receiver’s collaborative acts, and whether these patterns occur preferentially in specific collaborative phases (RQ2). We assume that specific triads relate more specifically to dealing with particular cognitive or relational matters (H2a). We also assume that some triads occur preferentially in specific collaboration phases (H2b).
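One way to make the triad analysis of RQ2 concrete is to count occurrences of each (emitter act, shared emotion, receiver act) triple per collaboration phase. The sketch below uses invented act and emotion labels purely for illustration; the study’s actual coding scheme is defined elsewhere.

```python
from collections import Counter

# Hypothetical sketch of the triad frequency analysis (RQ2): tally
# (emitter act, shared emotion, receiver act) triples per phase.
# Labels and events are illustrative, not the study's coding scheme.
events = [
    ("phase1", ("propose idea", "interested", "elaborate")),
    ("phase1", ("propose idea", "interested", "elaborate")),
    ("phase2", ("disagree", "frustrated", "justify")),
]

per_phase = {}
for phase, triad in events:
    per_phase.setdefault(phase, Counter())[triad] += 1

print(per_phase["phase1"].most_common(1))
# [(('propose idea', 'interested', 'elaborate'), 2)]
```

Comparing the resulting frequency distributions across phases would then address H2b, while inspecting which acts co-occur with which emotions addresses H2a.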

6.2.2 Method

6.2.2.1 Participants

The analysis was performed on data from a sample of 22 participants (12 women and 10 men; M = 23.9 years, SD = 7.45) taken from the freely accessible EATMINT database (Chanel et al., 2013), which gathers multi-modal and multi-user data on affect and social behaviors recorded during a computer-supported creative problem-solving collaboration (Molinari et al., 2013), including physiological signals (electrocardiogram, electrodermal activity, blood volume pulse, respiration, skin temperature), behaviors (eye movements, facial expressions, software action logs) and discourse (speech signals and transcripts).

6.2.2.2 Procedure

Eleven same-gender dyads of participants using networked computers worked together in the DREW software (Jaillon et al., 2002), a collaborative environment that includes an argument graph tool allowing collaborators to build a joint map of their argumentation. Participants could communicate through microphone headsets, and their verbal exchanges were recorded. They did not see each other. Participants were asked to use the argument graph tool to collaboratively create a slogan against violence at school (Figure 6.2, left). The collaboration lasted about 36 minutes on average and was divided into three main phases. Participants were instructed to spend two-fifths of the time in phase 1, where they were to produce as many slogan ideas as possible, two-fifths of the time in phase 2, where they were to debate with each other and agree on three slogans, and one-fifth of the time in the last phase, where they were to choose the best slogan.
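Applied to the nominal 36-minute average session, the two-fifths / two-fifths / one-fifth split works out as follows (a minimal sketch; actual phase durations varied across dyads):

```python
# Illustrative arithmetic for the phase allocation (2/5, 2/5, 1/5) of a
# nominal 36-minute session; the 36-minute figure is an average only.
total_minutes = 36
fractions = {"idea generation": 2 / 5, "debate": 2 / 5, "final choice": 1 / 5}
phases = {name: total_minutes * f for name, f in fractions.items()}
for name, minutes in phases.items():
    print(f"{name}: {minutes:.1f} min")
# idea generation: 14.4 min
# debate: 14.4 min
# final choice: 7.2 min
```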

Dyad members were also provided with a tool allowing them to share in real-time verbal labels of their emotions through an emotion awareness tool (Figure 6.2, right).

They could choose among 10 positive (delighted, focused, interested, satisfied, empathic, confident, amused, relaxed, grateful and relieved) and 10 negative (stressed, annoyed, surprised, disappointed, envious, anxious, dissatisfied, confused, frustrated, bored) emotions by clicking on them. The emotions available in the emotion awareness tool were chosen based on a survey carried out on 59 participants regarding the most frequent emotions experienced during a collaborative task, among the 36 emotions of the Geneva Emotion Wheel (Sacharin et al., 2012). Once participants selected an emotion, it was automatically displayed to them (green area in Figure 6.2) as well as to their partner (blue area in Figure 6.2). Participants were instructed that they were free to self-report their emotions at any time during collaboration (for a complete description, see Molinari et al., 2013). In addition, they were prompted with a pop-up window to share their emotions at the beginning of the interaction and every 5 minutes during the collaboration.
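The label set and prompting schedule described above can be summarized in a short sketch. The two label lists are taken directly from the text; the `prompt_times` helper is a hypothetical simplification that models only the timing of prompts, not the tool’s pop-up or display behavior.

```python
# The 20 emotion labels available in the awareness tool, as listed above.
POSITIVE = ["delighted", "focused", "interested", "satisfied", "empathic",
            "confident", "amused", "relaxed", "grateful", "relieved"]
NEGATIVE = ["stressed", "annoyed", "surprised", "disappointed", "envious",
            "anxious", "dissatisfied", "confused", "frustrated", "bored"]

def prompt_times(session_seconds, interval=300):
    """Times (s) at which participants were prompted to share an emotion:
    at the start of the interaction, then every 5 minutes (300 s)."""
    return list(range(0, session_seconds + 1, interval))

print(len(POSITIVE), len(NEGATIVE))  # 10 10
print(prompt_times(36 * 60)[:3])     # [0, 300, 600]
```

For an average 36-minute session this schedule yields eight scheduled prompts, on top of any spontaneous self-reports participants chose to make.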

FIGURE 6.2: The DREW interface coupled with an emotion awareness tool for sharing verbal labels of emotions in real time during the collaborative problem-solving task.