Article

Integrating neural and ocular attention reorienting signals in virtual reality

Journal

JOURNAL OF NEURAL ENGINEERING
Volume 18, Issue 6, Pages -

Publisher

IOP Publishing Ltd
DOI: 10.1088/1741-2552/ac4593

Keywords

attention; eyetracking; EEG; BCI

Funding

  1. National Science Foundation [IIS-1816363]
  2. Army Research Laboratory Cooperative Agreement [W911NF-10-2-0022]
  3. US Department of Defense [N00014-20-1-2027]

Abstract

This study utilized virtual reality (VR) technology to investigate the relationship between gaze and attention reorienting, finding that gaze dwell time contributed most significantly to reorienting signals. By integrating EEG, pupil, and dwell time features, a hybrid classifier successfully detected reorienting signals in both fixed and free conditions.
Objective. Reorienting is central to how humans direct attention to different stimuli in their environment. Previous studies typically employ well-controlled paradigms with limited eye and head movements to study the neural and physiological processes underlying attention reorienting. Here, we aim to better understand the relationship between gaze and attention reorienting using a naturalistic virtual reality (VR)-based target detection paradigm.

Approach. Subjects were navigated through a city and instructed to count the number of targets that appeared on the street. Subjects performed the task in a fixed condition with no head movement and in a free condition where head movements were allowed. Electroencephalography (EEG), gaze, and pupil data were collected. To investigate how neural and physiological reorienting signals are distributed across different gaze events, we used hierarchical discriminant component analysis (HDCA) to identify EEG- and pupil-based discriminating components. Mixed-effects general linear models (GLMs) were used to determine the correlation between these discriminating components and the timing of the different gaze events. HDCA was also used to combine EEG, pupil, and dwell time signals to classify reorienting events.

Main results. In both EEG and pupil, dwell time contributes most significantly to the reorienting signals. However, when dwell times were orthogonalized against other gaze events, the distributions of the reorienting signals differed across the two modalities, with EEG reorienting signals leading pupil reorienting signals. We also found that a hybrid classifier integrating EEG, pupil, and dwell time features detects the reorienting signals in both the fixed (AUC = 0.79) and the free (AUC = 0.77) conditions.

Significance. We show that neural and ocular reorienting signals are distributed differently across gaze events when a subject is immersed in VR, but can nevertheless be captured and integrated to classify target vs. distractor objects to which the human subject orients.
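The hybrid classification idea described in the abstract — combining per-event EEG, pupil, and dwell time features into a single discriminant score and evaluating it with AUC — can be illustrated with a minimal sketch. This is not the authors' HDCA implementation; it substitutes a plain logistic regression fit by gradient descent on synthetic data, where the feature names, class separations, and hyperparameters are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)  # 1 = target fixation, 0 = distractor fixation

# Hypothetical per-fixation features (synthetic): an EEG discriminant
# amplitude, pupil dilation, and gaze dwell time, each shifted upward
# for targets. Dwell time is given the largest shift, mirroring the
# paper's finding that it contributes most to the reorienting signal.
eeg   = rng.normal(0, 1, n) + 1.0 * y
pupil = rng.normal(0, 1, n) + 0.8 * y
dwell = rng.normal(0, 1, n) + 1.5 * y
X = np.column_stack([eeg, pupil, dwell])

# Standardize features, then fit logistic regression by gradient descent.
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(3), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(target)
    g = p - y                               # gradient of log-loss
    w -= 0.1 * (X.T @ g) / n
    b -= 0.1 * g.mean()

# AUC via the rank (Mann-Whitney) statistic: the probability that a
# random target fixation outscores a random distractor fixation.
scores = X @ w + b
pos, neg = scores[y == 1], scores[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
print(f"hybrid AUC = {auc:.2f}, weights = {np.round(w, 2)}")
```

With these synthetic separations the combined score reaches an AUC comparable to the values reported in the abstract, and the dwell time weight dominates, consistent with dwell time carrying the strongest reorienting signal.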

