Decoding Visual Attentional State using EEG-based BCI

Texte intégral

HAL Id: hal-01934356
https://hal.archives-ouvertes.fr/hal-01934356
Submitted on 26 Nov 2018



Soheil Borhani, Reza Abiri, Sara Esfahani, Justin Kilmarx, Yang Jiang, Xiaopeng Zhao



This work was in part supported by NeuroNet and Alzheimer’s Tennessee.

Soheil Borhani¹, Reza Abiri², Sara Parvanezadeh Esfahani³, Justin Kilmarx⁴, Xiaopeng Zhao¹

¹ Department of Mechanical, Aerospace, and Biomedical Engineering, University of Tennessee, Knoxville, USA
² Department of Neurology, University of California, San Francisco/Berkeley, CA, USA
³ Department of Psychology, University of Tennessee, Knoxville, USA
⁴ Department of Mechanical Engineering, University of Texas, Austin, USA

[Figure: Experimental protocol. Phase 1 (Image recognition): 8 blocks of trials, each presenting a 1000 ms stimulus followed by a 1000–1500 ms inter-stimulus interval. Phase 2 (Attention evaluation): 8 blocks of trials with 1000 ms stimulus presentations.]

Introduction

One crucial factor in human cognition and perception is attention: the ability to facilitate the processing of perceptually salient information while blocking out information irrelevant to the ongoing task. Sustained attention, also known as vigilance, refers to the ability to maintain focus on a task over a prolonged period. Visual attention is a complex phenomenon of searching for a target while filtering out competing stimuli.

In the present study, we analyzed brainwave patterns during sustained attention in participants. Scalp electroencephalography (EEG) signals were collected in real time with a wireless headset during a visual attention task. This is a preliminary study toward designing a neurofeedback setup for boosting working memory and visual attention.

Participants

Thirty-eight college students (11 females, 21.3 ± 1.9 years; 27 males, 23.1 ± 5.2 years) participated in the experiment. All had normal or corrected-to-normal vision and no known history of neurological or psychological disorders. Five of the participants were left-handed and 33 were right-handed.

EEG recording

EEG data were acquired using a water-hydrated 14-channel Emotiv EPOC wireless headset over AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4 according to the international 10–20 standard, with a sampling rate of 128 Hz. The electrode-scalp impedance was kept below 10 kΩ for all electrodes. The received signals were referenced to the P3/P4 electrodes over the left and right mastoids.

Experimental protocol

Participants were exposed to two types of stimuli in two separate phases. In phase 1 (Image recognition), participants were shown a sequence of single images fairly selected from two categories, Face and Scene, and were asked to discriminate between subcategories by keyboard button press. In phase 2 (Attention evaluation), participants were shown a sequence of overlaid images fairly selected from the same two categories and were again asked to discriminate between subcategories by keyboard button press.
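As a rough illustration, the two-phase trial structure can be sketched as a trial-list generator. The block count, the stimulus categories, and the 1000 ms / 1000–1500 ms timing come from the protocol; the function name `build_phase`, the number of trials per block, and the application of the jittered interval to both phases are assumptions for illustration only.

```python
import random

def build_phase(overlaid, n_blocks=8, trials_per_block=40, seed=0):
    """Sketch of one experimental phase: n_blocks blocks of trials whose
    stimulus category is fairly (uniformly) drawn from Face/Scene.
    trials_per_block is an assumed value; the poster does not state it."""
    rng = random.Random(seed)
    blocks = []
    for _ in range(n_blocks):
        block = []
        for _ in range(trials_per_block):
            block.append({
                'category': rng.choice(['Face', 'Scene']),  # fair selection
                'overlaid': overlaid,   # phase 2 shows overlaid images
                'stim_ms': 1000,        # 1000 ms stimulus presentation
                # jittered 1000-1500 ms inter-stimulus interval
                # (shown for phase 1; assumed similar for phase 2)
                'isi_ms': rng.randint(1000, 1500),
            })
        blocks.append(block)
    return blocks

phase1 = build_phase(overlaid=False)          # image recognition
phase2 = build_phase(overlaid=True, seed=1)   # attention evaluation
```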

References

[1] Borhani, S., Abiri, R., Muhammad, J. I., Jiang, Y., and Zhao, X. (2018). "EEG-based Visual Attentional State Decoding Using Convolutional Neural Network," presented at the 7th International BCI Meeting, Pacific Grove, CA, United States, May 21, 2018. Available: https://hal.archives-ouvertes.fr/hal-01843916

[2] Jiang, Y., Abiri, R., and Zhao, X. (2017). Tuning up the old brain with new tricks: attention training via neurofeedback. Frontiers in Aging Neuroscience, 9, 52.

[3] deBettencourt, M. T., Cohen, J. D., Lee, R. F., Norman, K. A., and Turk-Browne, N. B. (2015). Closed-loop training of attention with real-time brain imaging. Nature Neuroscience, 18(3), 470.

Individual data analysis pipeline:

1. High-pass filter (cut-off frequency = 0.5 Hz)
2. Remove electrical grid noise
3. Reject noisy channels
4. Artifact Subspace Reconstruction
5. Interpolate rejected channels
6. Common Average Reference (CAR)
7. Independent Component Analysis (with PCA reduction to the data rank)
8. Fit equivalent current dipoles (DIPFIT2)
9. Discriminate neural vs. non-neural ICs
10. Single-IC time-frequency decomposition
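The authors implemented this pipeline with EEGLAB (ASR, ICA, DIPFIT2). A minimal sketch of only the first few stages is shown below, assuming 60 Hz mains frequency and substituting a crude variance-based channel rejection and mean-replacement "interpolation" for EEGLAB's actual methods; the function `preprocess` and its threshold are our own illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def preprocess(data, sfreq, z_thresh=5.0):
    """Sketch of the early pipeline stages on a (n_channels, n_samples)
    array: high-pass, notch, noisy-channel rejection, CAR. ASR, true
    spherical-spline interpolation, ICA, and dipole fitting are omitted."""
    # 1. High-pass filter, 0.5 Hz cut-off (zero-phase Butterworth)
    b, a = butter(4, 0.5 / (sfreq / 2.0), btype='highpass')
    data = filtfilt(b, a, data, axis=1)
    # 2. Remove electrical grid noise (60 Hz notch; use 50 Hz in Europe)
    b, a = iirnotch(60.0, Q=30.0, fs=sfreq)
    data = filtfilt(b, a, data, axis=1)
    # 3. Reject noisy channels via a robust (median/MAD) variance score
    var = data.var(axis=1)
    med = np.median(var)
    mad = 1.4826 * np.median(np.abs(var - med))
    bad = np.abs(var - med) > z_thresh * mad
    # crude stand-in for interpolation: replace bad channels with the
    # mean of the remaining good channels
    data[bad] = data[~bad].mean(axis=0)
    # 4. Common Average Reference (CAR)
    data = data - data.mean(axis=0, keepdims=True)
    return data, bad

# toy demo: 14 channels at 128 Hz, channel 0 deliberately very noisy
rng = np.random.default_rng(0)
raw = rng.standard_normal((14, 1280))
raw[0] *= 50.0
clean, bad = preprocess(raw, sfreq=128.0)
```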

Behavioral response (mean [SD]):

                               Accuracy        Reaction times
Phase 1: Image recognition     95.7 [2.7] %    547 [69] ms / 633 [53] ms
Phase 2: Attention evaluation  88.1 [5.4] %    667 [73] ms / 706 [62] ms

Time-frequency analysis was performed using the EEGLAB toolbox. A tapered moving Hanning window with a short-time Fourier transform extracted the time-frequency content of the epoched data over all trials. We used event-related spectral perturbation (ERSP) over the grouped data to measure fluctuations of component power in the 5–45 Hz frequency band. A Morlet wavelet was applied with a linearly increasing number of cycles, from 1 at 5 Hz to 7 at 45 Hz. Linearly spaced frequencies ranging from 5 Hz to 45 Hz were used for ERSP and inter-trial coherence (ITC). To obtain the ERSP, we averaged the spectral power across all trials of "Face" and "Scene" stimuli separately. The calculated spectral power was then converted to log power for better illustration. We used a pre-stimulus period of [-100, 0] ms as the baseline to calculate event-related desynchronization (ERD) and event-related synchronization (ERS) for both phase 1 and phase 2.
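The study computed ERSP with EEGLAB's time-frequency tools; the core computation can be sketched in plain NumPy as follows. The 5–45 Hz linearly spaced frequencies, the 1-to-7-cycle ramp, and the [-100, 0] ms baseline come from the text; the function names and the synthetic demo data are our own illustration.

```python
import numpy as np

def morlet(f, c, sfreq):
    """Complex Morlet wavelet with c cycles at frequency f (Hz)."""
    sigma_t = c / (2.0 * np.pi * f)            # temporal std in seconds
    t = np.arange(-4.0 * sigma_t, 4.0 * sigma_t, 1.0 / sfreq)
    return np.exp(2j * np.pi * f * t - t**2 / (2.0 * sigma_t**2))

def ersp_db(epochs, sfreq, freqs, n_cycles, baseline):
    """Trial-averaged Morlet wavelet power, log-normalized (dB) against a
    pre-stimulus baseline. epochs: (n_trials, n_times); baseline: boolean
    mask over time samples marking the pre-stimulus period.
    ERS shows up as positive dB values, ERD as negative ones."""
    power = np.zeros((len(freqs), epochs.shape[1]))
    for i, (f, c) in enumerate(zip(freqs, n_cycles)):
        wavelet = morlet(f, c, sfreq)
        for trial in epochs:
            power[i] += np.abs(np.convolve(trial, wavelet, mode='same')) ** 2
    power /= len(epochs)
    base = power[:, baseline].mean(axis=1, keepdims=True)
    return 10.0 * np.log10(power / base)

# linearly spaced 5-45 Hz; cycles increase linearly from 1 to 7
freqs = np.linspace(5.0, 45.0, 9)
n_cycles = 1.0 + 6.0 * (freqs - 5.0) / 40.0
sfreq = 128.0
t = np.arange(int(1.1 * sfreq)) / sfreq - 0.1   # epoch from -100 ms to ~1000 ms
rng = np.random.default_rng(42)
epochs = rng.standard_normal((20, t.size))
epochs += 2.0 * np.sin(2 * np.pi * 10.0 * t) * (t > 0.2)  # post-stimulus 10 Hz burst
ersp = ersp_db(epochs, sfreq, freqs, n_cycles, baseline=(t < 0.0))
```

With this synthetic burst, the 10 Hz row of `ersp` shows a clear post-stimulus power increase (ERS) relative to the pre-stimulus baseline.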

[Figures, phases 1 and 2: average ERSP of "Face" trials, average ERSP of "Scene" trials, and ITC, together with the equivalent current dipoles estimated from each participating independent component, localized over the right fusiform gyrus.]
