
Pilote-Automation Interactions and Cooperation in highly automated Cockpits


HAL Id: hal-02085142

https://hal.archives-ouvertes.fr/hal-02085142

Submitted on 30 Mar 2019

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

Pilote-Automation Interactions and Cooperation in highly automated Cockpits

Marielle Plat-Robain, Janine Rogalski, René Amalberti

To cite this version:

Marielle Plat-Robain, Janine Rogalski, René Amalberti. Pilote-Automation Interactions and Cooperation in highly automated Cockpits. HCI AERO 98, 1998, Montreal, Canada. ⟨hal-02085142⟩

Pilote-Automation Interactions and Cooperation in highly automated Cockpits

Marielle Plat*, Janine Rogalski*, René Amalberti#

* Laboratoire Cognition & Activités Finalisées, CNRS-Université Paris 8, 2 rue de la liberté, F-93526 Saint-Denis Cedex 2.
mplat@univ-paris8.fr, rogalskij@univ-Paris8.fr

# IMASSA-CERMA, Base d'essais en vol, Département Sciences Cognitives et Ergonomie Aérospatiale, F-91228 Brétigny-sur-Orge.
rene-a@imaginet.fr

ABSTRACT

Two studies of cooperation in highly automated cockpits are presented. In the first, we analysed how instructors manage two levels of co-ordination during type-qualification training, namely interactions between pilots and automated systems, and communication between pilots, taking their interventions as reflecting their models of teamwork. In the second study, we analysed how expert pilots manage atypical faults, such as dysfunctions of the automated systems. Our hypothesis is that disturbances of co-operation reveal the difficulty the crew experiences in managing the situation by manual intervention. Results show that co-operation is subordinated to situation mastery: when pilots do not have sufficient control over the objects of their action, they treat as a secondary task the communication aimed at sharing mental models and allocating tasks. These results may be interpreted within a model of cognitive compromise (Amalberti, 1996) in managing internal resources while pursuing goals at various levels.

Keywords

human-automation interaction, co-operation, situation awareness, crew resource management, training.

INTRODUCTION

Sarter and Woods have demonstrated the importance of pilot-automation interaction problems in the glass-cockpit generation. They emphasized the role of « mode errors » (Woods & Sarter, 1992), the complexity of acquiring knowledge about automated systems, and the difficult transition from theoretical, out-of-context knowledge to operational knowledge. They identified new knowledge requirements, such as the difficulty of recognizing mode changes and the operations required to produce those changes (Sarter & Woods, 1994; Sarter & Woods, 1995b), which matter because situation awareness depends on awareness of the active mode over time. Wiener (1985) and Amalberti & Valot (1987) had already noted that poor situation awareness is related to an overly simple understanding of the automation and of the specific way it operates.

Sarter and Woods also showed that pilot-automation interaction in highly automated systems differs because of the system's autonomous level of operation: the automation must in some way be regarded as a free agent, the third agent of the cockpit (Sarter & Woods, 1995a). Interaction in highly automated cockpits must therefore be considered as twofold: on the one hand, human interaction between pilots; on the other, interaction with these particular agents (the automatisms). Since the cockpit architecture, with its independent interfaces, does not support cooperation between pilots, pilots must refrain from slipping into the easiness of acting alone and must communicate with their team-mate, explaining what they are doing or offering help (Pelegrin & Amalberti, 1993).

Moreover, pilots suspect their own errors (collective or individual) before considering a possible automation fault, even when they are only partially aware of which part of the automation is acting (and are often wrong about it), because of its complex and hidden way of operating. As a consequence, this produces hesitation to revert to manual control, difficulty in taking decisions, and thereby serious incidents; such incidents may even, exceptionally, become accidents (Sarter & Woods, 1992; Amalberti, 1994, in press).

These problems are not specific to one aircraft or one manufacturer; they follow a more global evolution, in which the whole aeronautical environment is rapidly changing through a complex computerization process driven by the economic need for high performance (Abbott et al., 1996).

The system is on the whole safe: computerized automation has greatly reduced the sources of risk and has shifted the safety limit to new residual points of difficulty, described in this paper as the difficulties of front-line agents in acting with the automation.

Industry (in a broad sense including manufacturers, airlines and authorities) is not ignoring these residual sensitive points. Decisions have been taken to improve crew awareness of the evolution of aircraft automation.

Incident/accident reporting systems are being taken into account in design and regulation, with the aim of creating better interfaces in future aircraft. For aircraft currently flying, pilots are required to talk extensively to each other, to announce every automation change, and to exchange all points of view about the on-line situation and the state of the world in the cockpit. Instructors have consequently been asked to teach pilots this new cooperative way of communicating during professional training.

Another requirement is to revert to manual control whenever pilots have a doubt about their operational understanding of what the automation is doing.

The two experiments reported below evaluate how these two philosophies are actually applied in a training cockpit (communicating aloud, and reverting to manual as soon as pilots do not understand what is going on). Beyond these pragmatic results, the analyses contribute to a cognitive model of interaction in the cockpit.

The first experiment concerns instructors' interventions about oral communication and cooperation in highly automated training cockpits during fault-training sessions. The crews are in type-qualification training, learning to fly this aircraft.

Instructor interventions are taken as revealing crew difficulties.

The second experiment analyses crew reactions to non-standard faults within an extended professional training framework, with pilots already qualified on this aircraft. Both experiments were carried out on a full-flight simulator.

INSTRUCTOR INTERVENTIONS AS REVEALING CREW DIFFICULTIES

The first experiment focuses on crews learning a new, highly automated aircraft on a simulator (type qualification). During this training an instructor is on board to correct any errors.

An analysis of the structure of communication within the cockpit from this experiment has already been published (Rogalski, 1996). The results presented here have not been published, except in a student report (Plat, 1996), and differ in their focus on instructor interventions as revealing crew difficulties.

Introduction

The analysis developed here is part of an experiment on competence during type-qualification training on a highly automated aircraft. The data are instructor and pilot activities during three full-flight simulator sessions (middle and end of qualification). Video data were recorded during standard aeronautical training exercises: a major incident in a critical phase (fire in the engine nacelle during take-off), an instrument approach (VOR-DME), and a usual disruption at landing (go-around). Recording used three video cameras (one at each side of the cabin, and one at the back of the flight deck recording the instrument panel). The transcription covers, on the one hand, oral communication (instructor and each pilot) and, on the other hand, gestures, actions and pilot glances coded according to functional cockpit zones (Rogalski, Samurcay and Amalberti, 1994). Previous results on pilot activity showed that the quality of cooperation in the cockpit strongly depends on the integration of technical knowledge.

They show that crews correctly carry out the required procedures, even if information sharing is often implicit rather than oral as expected. We observed a real process of economy in these call-outs as long as everything is fine in the cockpit: they are partly replaced by non-verbal communication (glances, gestures, actions) consistent with usual schemas (Rogalski, 1996).

This experiment also showed that such communication between pilots (for shared situation awareness) was more present when technical procedures were correctly memorized, and that instructors intervened first on system mastery before intervening on the cooperative process within the cockpit. During explicit sharing (oral communication), which allows the situation representation to be kept up to date, actions and the call-outs concerning them predominate, to the detriment of information about system states. We are interested here in instructor activities, and more specifically in interventions about pilot-automation interactions and, of course, about interactions between pilots.

Methodology

Instructor interventions were analysed for each of the three sessions, which are quite similar across full-flight simulator training (Plat, 1996). The analysis focused on the two phases of interest (take-off and approach), considered the most problematic. Detailed data about pilot activity allowed us to identify the theme of each intervention. We distinguished, on the one hand, interventions on pilot-automation interaction as opposed to other technical components and, on the other hand, interventions on cooperation.

Interventions are classified as positive when they reinforce the crew's actions or teach new knowledge, and as negative when they correct the crew's actions. We also coded the timing of the intervention relative to the action: before the action (anticipation or prevention), during its execution, or on its effects.

Results

• Interventions are mainly focused on the approach phase and concern points of automated flying: taken as a whole, more than half of the interventions are about automated actions during the approach phase.

• The position of an intervention relative to the crew's action differs according to which technical system is active: interventions about non-automated flying actions occur mostly during the crew's action, whereas interventions about automated flying actions occur mainly before or after the action. The latter often convey operational knowledge in an informative way, which instructors regard as helpful for better use of the automation.

• Instructor interventions about crew cooperation are marginal, even though declaring information and actions, and distributing tasks, are also knowledge that needs to be conditionalized to context during simulator training.

• Instructor interventions about anticipating and monitoring automation states are limited to the action at hand. Yet such interventions could contribute to mode awareness, which remains quite implicit in the cockpit.

• Over the course of training, interventions decrease in line with the crew's acquisitions. Negative interventions diminish and are marginal by the last session. Positive interventions show very large between-crew variability.

• Differences between triads (instructor plus two-pilot crew) are very important. Most interventions occur during the action and are mostly positive; interventions before or after the action tend instead to be negative.

Instructor interventions during qualification training thus show a specific pattern where the automation is concerned. They are centred on the technical aspects of pilot-automation interaction; interventions on cooperation remain marginal. Everything happens as if instructors' actions were focused on explicit difficulties in operating procedures and on evaluating performance in crew cockpit activities, without explicitly attending to the cooperative process. It is as if we had a mirror image of the crew's own way of functioning. This mode of functioning was analysed in indexed situations, to which precise procedures correspond. What about situations of automation dysfunction, where a simple instruction to revert to manual is required? That question is the subject of the second experiment.

COOPERATION DURING AUTOMATED SYSTEM DYSFUNCTIONS

Introduction

This experiment took place in a glass-cockpit full-flight simulator during Line Oriented Flight Training (LOFT). Surprising faults were invented for this LOFT; they concern the automation managing the flight (the Flight Control Unit) and were mostly unknown to the pilots. These faults are "computer bugs" created for the experiment. They required modifying the simulator's programming to duplicate the functions of this automation, making it possible to switch between normal operation and the programmed faults and back.

Ten pilot crews from a European airline were randomly selected by the company's planning department, as is done for all real trips and simulator sessions. (Crew members had been qualified on highly automated aircraft for one to six years for captains, and six months to five years for first officers.) Each session comprised a briefing (one hour) and a LOFT (2 h 20 min, an international trip). An air traffic controller took part in the LOFT. Crews were informed of the non-conventional nature of the faults, involving dysfunctions of the Flight Control Unit automation (this kind of information is customarily given for every simulator session).

Two different scenarios were used. The first comprises seven successive faults, five of which are "bugs"; its workload can be characterized as "many things to think about", with a high stress level. The second scenario comprises six faults: five are more standard faults (known to the crews) but demand time and heavy workload, and the sixth is one of the automation dysfunction faults from the first scenario. We characterized the workload of this scenario as "many things to do".

When facing automation dysfunctions, pilots are required to revert to manual control so as to remain within a well-defined, easier-to-manage set of procedures. Check-lists and cross-checks belong to the prescribed task: each team-mate is required to call out every action aloud before acting (the only exception being a ground protection system warning, where pilots act and talk at the same time).

Methodology

Connected by a video camera and a telephone, the instructor in the simulator and two experimenters (a pilot and an ergonomist) in an adjoining room monitored the proper running of each session. A camera recorded two screens displaying Flight Control Unit information, which helped to locate precisely when faults appeared and when the crew changed action modes. The crew's verbal exchanges were recorded by an additional microphone.

Audio and video records were transcribed in terms of system states, actions and cockpit verbalizations.

First, we analyse the call-outs, taken as revealing crew difficulties in managing the situation: strong perturbation should be related to less verbalization. The analysis began with a count, for each session, of call-outs related to automated actions, which could be made "as required" (call-out then action) or not (action then call-out, or action without any call-out). We then studied the cooperation modes (Rogalski, 1994) in the cockpit during degraded automation situations. The two scenarios were compared to evaluate the role of the type of cognitive workload.

• Collaboration: problem solving and the choice of decision are collective. The crew builds a common referential for the current activities and a common solution.

• Distributed cooperation: the task is decomposed into distributed sub-tasks for better efficiency, and a strategy for managing the synchronization of sub-tasks is created.

Results

It is particularly important to note that flight safety was never endangered: every crew completed the mission safely, without any major problem. Nevertheless, from a global point of view, crews did not choose to follow the prescribed instruction ("revert to manual") immediately.

In fact, most of them tried to understand the fault by running a series of tests, and then tried to restore a normal situation by searching for circuit breakers. Only when they felt that the situation was becoming unmanageable within the time constraints did they choose to revert to manual (Plat & Amalberti, 1997).

Action call-outs on automatisms

• In the first scenario, most call-outs conformed to expectations, with a notable share of non-conforming ones: 15% of actions without call-out, and 10% with postponed call-outs.

• In the second scenario, non-conforming call-outs dominate: 42% of actions without call-out and 26% with postponed call-outs; only a third of call-outs were made as the company requires.

Cooperation modes and decision choices

• In the first scenario, cooperation is mainly distributed cooperation (52.6%), then collaboration (44.7%). In the second scenario, collaboration dominates (65.2%), followed by distributed cooperation (30.4%).

• Most decisions were made explicit by the pilot flying (70%), whatever their hierarchical position. Shared decisions were few (10%).

• Conflicts (divergent points of view about a goal or sub-goal, or about how to reach it) were significant for only two crews out of ten, and each occurred only once. They appeared in crews where the collaborative mode was dominant throughout the flight. In both cases the captain was not the pilot flying, but was the final decision-maker.

The dysfunctions of automated systems, even though announced, disturbed the cooperative process, even for experienced pilots. The main fact to stress is the lack of call-outs. The second-scenario results confirm that the execution load of restoring standard faults while a dysfunction is present is particularly disturbing (more than 40% of actions were made without call-out). The cooperation modes (the prominence of collaboration) and the rarity of conflicts seem to us to be indicators of a collective regulation of activity while managing these dysfunctions.

DISCUSSION & CONCLUSION

The results show that explicit cooperation is a process subordinated to situation mastery: when pilots do not have sufficient mastery of the objects of their actions, they treat as secondary the call-outs that contribute to building a common representation of the situation.

On the one hand, industry may well be disappointed by this result, because it tends to prove that the prescribed solutions to the problems of communication and representation sharing in highly automated cockpits (talking, reverting to manual) are themselves sensitive to crew difficulties. When difficulty grows, precisely because the situation is hard to understand or control, pilots tend to reduce the occurrence of call-outs, to respect communication procedures at a lower level, and to stay at high levels of active automation (perhaps because they want to avoid the workload increase entailed by reverting to manual).

These last cases show that pilots try to understand the situation because they cannot renounce the workload reduction (in action execution) brought by the automation, and because they are not sure that this precise solution is the only one needed. But the failure to inhibit what we might call an "understanding scheme" produces an increase in cognitive load, which translates into a decrease in the communication process (call-outs). Several goals are in fact pursued at once: rebuilding the situation representation; operating the system with a guarantee against external risk; limiting the "internal" risk created by the lack of understanding; and maintaining shared spaces of representation and action within the crew. The cognitive compromise between usable resources and the different goals managed by the crew (Amalberti, 1996) seems to operate by subordinating explicit cooperation to implicit cooperation.

Indeed, during manufacturers' "glass cockpit" type-qualification training, instructor interventions do not stress the importance of explicit call-outs when everything is going well. Interventions are more present, in a corrective mode, when instructors can detect a gap from expected performance. The data from the second experiment show the vulnerability of the explicit cooperation process (call-outs) in case of automation dysfunction, all the more so when the operative workload is heavy (second scenario). Nevertheless, these data are not sufficient to describe the effects of the possible isolation of each team-mate and of differences in situation representation, which could produce debates and conflicts, and hence delays in task performance (Wiener, 1989). We may however note that, over the whole flight, the main decision-maker not being the pilot flying is strongly associated with conflict as a contributing factor.

Moreover, making automation state changes explicit supports awareness of the active mode, and call-outs are very useful for building collective situation awareness. Yet these two specific supports were observed to be weak in pilot activity, both in qualification training and among expert crews in highly disturbed situations. The fact that knowledge is not taught in relation to context during full-flight simulator sessions contributes to keeping it theoretical knowledge, « inert knowledge » in the sense of Sarter and Woods, which is moreover less available when action difficulties are strong, even though such knowledge would be productive in precisely those situations.

We have already shown that intervention topics such as « automatism states », which would help crews orient themselves within the automation's activity loop, were not an effective priority of full-flight simulator training interventions.

Nevertheless, in the second experiment we observed (Plat, 1997; Plat & Amalberti, 1997) that crews try to localize the source of the dysfunction and often try to restore the automation's functions, even though they have no guide for solving this kind of problem beyond the requirement to revert to manual, and no specific training for this kind of incident. It could be very useful to focus more interventions on the possible localization of automation states, which would then allow a better awareness of dysfunction sources and of their potential irrecoverability.

ACKNOWLEDGMENTS

The data analysed in the first experiment come from a contract between the DGAC, IMASSA, CNRS and the University Paris 8 on cooperative processes in highly automated cockpits. For the second experiment, the data come from a DGAC study of pilot reactions to automated system dysfunctions. We thank the companies, pilots and instructors for their cooperation. This paper engages only its authors.

REFERENCES

1. Abbott K., Stimson D., Slotte S., Bollin G., Hecht S., Imrich T., Lalley R., Lyddane G., Thiel G., Amalberti R., Fabre F., Newman T. (1996). The interfaces between flightcrews and modern flight deck systems. Report of the FAA HF team, June 1996. FAA: Washington, DC.

2. Amalberti, R. & Valot, C. (1987). Erreurs humaines dans certains accidents aériens et perspectives d'"aides intelligentes". Rev. Med. Aéro. & Spatiale, (26) 104, 319-320.

3. Amalberti, R. & de Courville, B. (1994). Cockpit automation. In R. Amalberti (Ed.), Briefings: a human factors course for professional pilots, IFSA: 145-158.

4. Amalberti, R. (1996). La conduite des systèmes à risque. Paris: PUF.

5. Amalberti, R. (in press). Cockpit automation: A human factors perspective. In D. Garland, J. Wise & D. Hopkin (Eds), Aviation Human Factors. L. Erlbaum Associates: Hillsdale, NJ.

6. Pelegrin, C. & Amalberti, R. (1993). Pilot's strategies of crew coordination in advanced glass-cockpits; a matter of expertise and culture. Circulaire OACI 217-AN/12.

7. Plat, M. (1996). Les interactions pilotes/automatismes en cours de formation dans les cockpits d'avion A320. Mémoire de maîtrise. Université Paris 8.

8. Plat, M. & Amalberti, R. (in press). Crew training to automation surprises. In N. Sarter & R. Amalberti (Eds), Cognitive Engineering in the Aviation Domain. Hillsdale: LEA.

9. Rogalski, J. (1994). Formation aux activités collectives. Le Travail humain, 57(4), 367-386.

10. Rogalski, J. (1996). Co-operation processes in dynamic environment management: Evolution through training experienced pilots in flying a highly automated aircraft. Acta Psychologica, 91, 273-295.

11. Sarter, N. B. & Woods, D. D. (1995a). "Strong, silent, and 'out of the loop'": Properties of advanced (cockpit) automation and their impact on human-automation interaction (Report No. 95-TR-01). CSEL.

12. Sarter, N. B. & Woods, D. D. (1995b). How in the world did we ever get into that mode? Mode error and awareness in supervisory control. Human Factors, 37(1), 5-19.

13. Sarter, N.B. & Woods, D.D. (1994) Pilot interaction with cockpit automation II: An experimental study of pilots' model and awareness of the Flight Management and Guidance System. International Journal of Aviation Psychology, 4(1), 1-28.

14. Wiener, E. (1985). Beyond the sterile cockpit. Human Factors, 27(1), 75-90.

15. Wiener, E. (1989). Human factors of advanced technology ("glass cockpit") transport aircraft. NASA Contractor Report 177528.

16. Woods, D. D. & Sarter, N. B. (1992). Mode error in supervisory control of automated systems. Proceedings of the Human Factors Society 36th Annual Meeting, Atlanta, GA.
