Pilot’s head tracking as a proxy for gaze tracking

ISAE-SUPAERO Conference paper

The 1st International Conference on Cognitive Aircraft Systems – ICCAS

March 18-19, 2020

https://events.isae-supaero.fr/event/2

Scientific Committee

Mickaël Causse, ISAE-SUPAERO

Caroline Chanel, ISAE-SUPAERO

Jean-Charles Chaudemar, ISAE-SUPAERO

Stéphane Durand, Dassault Aviation

Bruno Patin, Dassault Aviation

Nicolas Devaux, Dassault Aviation

Jean-Louis Gueneau, Dassault Aviation

Claudine Mélan, Université Toulouse Jean-Jaurès

Jean-Paul Imbert, ENAC

Permanent link: https://doi.org/10.34849/cfsb-t270


Content

Eye tracking is often used in aviation to study pilots' visual circuits, which are relevant to the first two stages of eye-tracking integration [1]. However, the sensors can be expensive or difficult to integrate into a cockpit, and they can be sensitive to light variations. We propose to start studying the movements of the pilot's head as a proxy for their visual interest, because the necessary sensors are already integrated in many virtual reality setups and are comparatively inexpensive. Of course, the data quality cannot be considered as good, because the pilot can move their eyes without moving their head. The goal of this study is therefore to assess the value of head tracking as such a proxy.

We collected head movement data during virtual reality flights in a light aircraft, using a two-phase protocol:

1) A calibration phase, in which the pilot is asked to look at specific points
2) A series of simple flight phases (climb, turn…)

The calibration phase is useful because some pilots move their head a lot, whereas others rely more on eye movements.
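As an illustration of how such a calibration could be exploited, the sketch below fits a per-pilot "head gain" relating measured head yaw to the known yaw of each calibration target; a gain close to 1 would indicate a pilot who turns their head fully towards targets, while a low gain would indicate reliance on eye movements. This is a hypothetical Python model, not the exact procedure used in the study.

```python
import numpy as np

def fit_head_gain(target_yaw_deg, head_yaw_deg):
    """Least-squares gain and offset relating measured head yaw to the yaw of
    the calibration target being fixated.  (Hypothetical calibration model.)"""
    A = np.column_stack([target_yaw_deg, np.ones_like(target_yaw_deg)])
    (gain, offset), *_ = np.linalg.lstsq(A, head_yaw_deg, rcond=None)
    return gain, offset

# Example: calibration targets at known yaw angles (degrees) and the head yaw
# measured while the pilot reports fixating each one (illustrative values).
targets = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])
head_yaw = np.array([-35.0, -18.0, 1.0, 16.0, 34.0])
gain, offset = fit_head_gain(targets, head_yaw)
print(f"head/gaze gain ~ {gain:.2f}, offset ~ {offset:.1f} deg")
```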

The virtual reality headset used, the Oculus Rift, already performs its own corrective signal processing and outputs the head's orientation [2].
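The orientation is typically delivered as a unit quaternion; for AOI analysis it is convenient to reduce it to yaw (left/right) and pitch (up/down) angles. The snippet below is a minimal sketch of that conversion, assuming standard Z-Y-X Euler conventions; the actual axis conventions of the headset's SDK may differ.

```python
import numpy as np

def quat_to_yaw_pitch(w, x, y, z):
    """Convert a unit orientation quaternion to yaw and pitch in degrees,
    using Z-Y-X (yaw-pitch-roll) Euler angles.  The axis convention is an
    assumption and may need to be adapted to the headset's frame."""
    yaw = np.degrees(np.arctan2(2.0 * (w * z + x * y),
                                1.0 - 2.0 * (y * y + z * z)))
    pitch = np.degrees(np.arcsin(np.clip(2.0 * (w * y - z * x), -1.0, 1.0)))
    return yaw, pitch

print(quat_to_yaw_pitch(0.9659, 0.0, 0.2588, 0.0))  # ~ (0.0, 30.0): head pitched 30 deg up
```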

We then defined several areas of interest (AOIs):

• Outside the cockpit, ahead
• Outside, left
• Outside, right
• Primary flight instruments
• Secondary flight instruments
• Low (e.g. looking at a map on one's knees)

We then analyzed each flight by slicing it into its successive flight phases.
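A minimal sketch of this step, assuming the head orientation has already been reduced to yaw/pitch angles: each sample is mapped to one of the AOIs listed above using illustrative angular thresholds (not the boundaries used in the study), and dwell shares are then aggregated per flight phase.

```python
from collections import Counter

def classify_aoi(yaw_deg, pitch_deg):
    """Map a head orientation sample to an AOI.  Thresholds are illustrative."""
    if pitch_deg < -45:
        return "low (map on knees)"
    if pitch_deg < -15:
        return "primary instruments" if abs(yaw_deg) < 20 else "secondary instruments"
    if yaw_deg < -30:
        return "outside, left"
    if yaw_deg > 30:
        return "outside, right"
    return "outside, ahead"

def aoi_shares_by_phase(samples):
    """samples: iterable of (flight_phase, yaw_deg, pitch_deg) tuples.
    Returns, for each flight phase, the fraction of samples spent in each AOI."""
    counts = {}
    for phase, yaw, pitch in samples:
        counts.setdefault(phase, Counter())[classify_aoi(yaw, pitch)] += 1
    return {phase: {aoi: n / sum(c.values()) for aoi, n in c.items()}
            for phase, c in counts.items()}

# Example with a handful of fabricated samples:
print(aoi_shares_by_phase([("turn", -50, -20), ("turn", -40, -5),
                           ("climb", 0, -20), ("climb", 5, 0)]))
```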

We observed a few interesting preliminary results. For instance, searching for nearby traffic generates broader-than-usual head movements. In a turn, the pilot turns their head to the side but also moves it downwards, because their vision is obstructed by the roof of the plane. During a climb, there are some downward head movements towards the flight instruments, but they are much smaller than during a turn, suggesting that eye movements are mostly sufficient to obtain the necessary information. Finally, experienced pilots seem to move their head less than beginners, perhaps because they make greater use of peripheral vision, consistent with [3].

Additional data will be necessary to make these results more robust. In future analyses, we also plan to use more formal statistical metrics to describe the pilot's visual circuits, as well as further signal processing to detect abnormal pilot states (unconscious pilot, pilot in attentional tunneling…). We will also further study the differences in head behaviour between experienced and beginner pilots, and we will start integrating IMUs on the pilot's head in real flights.
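As one example of the kind of metric we have in mind, the sketch below computes the head's angular dispersion over successive windows and flags windows of abnormally narrow scanning, which could serve as a crude indicator of attentional tunneling. The window length and threshold are arbitrary illustrative values, not validated parameters.

```python
import numpy as np

def head_dispersion(yaw_deg, pitch_deg):
    """Root-mean-square angular deviation of the head from its mean orientation
    over a window; a simple candidate metric for the breadth of the visual scan."""
    yaw, pitch = np.asarray(yaw_deg), np.asarray(pitch_deg)
    return float(np.sqrt(np.mean((yaw - yaw.mean()) ** 2
                                 + (pitch - pitch.mean()) ** 2)))

def flag_narrow_scanning(yaw_deg, pitch_deg, window=600, threshold_deg=5.0):
    """Flag successive windows (e.g. 600 samples at 60 Hz, i.e. 10 s) whose head
    dispersion falls below an assumed threshold (illustrative parameters)."""
    flags = []
    for start in range(0, len(yaw_deg) - window + 1, window):
        d = head_dispersion(yaw_deg[start:start + window],
                            pitch_deg[start:start + window])
        flags.append(d < threshold_deg)
    return flags
```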

[1] Peysakhovich, Vsevolod, et al. “The neuroergonomics of aircraft cockpits: the four stages of eye-tracking integration to enhance flight safety.” Safety (2018).

[2] LaValle, Steven M., et al. “Head tracking for the Oculus Rift.” 2014 IEEE International Conference on Robotics and Automation. IEEE, 2014.

[3] Fox, Julianne, et al. “Information extraction during instrument flight: An evaluation of the validity of the eye-mind hypothesis.” Proceedings of the HFES Annual Meeting. 1996.

Keywords: Eye tracking, EEG, fNIRS, Other measurement methods, Brain-computer interfaces
