Article

Linking Attention-Based Multiscale CNN With Dynamical GCN for Driving Fatigue Detection

Journal

IEEE Transactions on Instrumentation and Measurement

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIM.2020.3047502

Keywords

Attention-based multiscale convolutional neural network (CNN); driving fatigue; dynamical graph convolution network (GCN); electroencephalography (EEG); spatiotemporal structure

Funding

  1. Special Projects in Key Fields by Technology Development Project of Guangdong Province [2020ZDZX3018]
  2. Special Fund for Science and Technology of Guangdong Province [2020182]
  3. Wuyi University and Hong Kong & Macao Joint Research and Development Project [2019WGALH16]
  4. Science Foundation for Young Teachers of Wuyi University [2018td01]
  5. Jiangmen Brain-like Computation and Hybrid Intelligence Research and Development Center [[2018]359, [2019]26]
  6. Scientific Research Startup Funds for High-Level Talents of Wuyi University [2020AL006]
  7. Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging [SZD201909]
  8. Science, Technology and Innovation Commission of Shenzhen Municipality Technology Fund [JCYJ20170818093322718]
  9. National Natural Science Foundation [81871443]

This article introduces a new AMCNN-DGCN model, which automatically learns frequency filters through multiscale temporal convolutions to extract salient patterns from EEG data, and then uses DGCNs to learn spatial filters effectively, capturing highly discriminative features. Experimental results show that AMCNN-DGCN achieves a high accuracy of 95.65%, outperforming six widely used competitive EEG models.
Electroencephalography (EEG) signals have proven to be among the most predictive and reliable indicators for estimating driving fatigue state. However, how to make full use of EEG data for driving fatigue detection remains a challenge. Many existing methods rely on a time-consuming manual process or tedious parameter tuning for feature extraction, making them inconvenient to train and deploy. Moreover, most models ignore or manually specify EEG connectivity between channels, and thus fail to fully exploit the intrinsic interchannel relations for classification. In this article, we introduce a new attention-based multiscale convolutional neural network-dynamical graph convolutional network (AMCNN-DGCN) model that addresses both issues in a unified end-to-end framework. AMCNN-DGCN starts with attention-based multiscale temporal convolutions that automatically learn frequency filters to extract salient patterns from raw EEG data. Subsequently, AMCNN-DGCN uses dynamical graph convolutional networks (DGCNs) to learn spatial filters, in which the adjacency matrix is determined adaptively, in a data-driven way, to exploit the intrinsic relationships between channels. With this temporal-spatial structure, AMCNN-DGCN can capture highly discriminative features. To verify its effectiveness, we set up a simulated fatigue-driving environment and collected EEG signals from 29 healthy subjects (male/female = 17/12, age = 23.28 +/- 2.70 years) through a 24-channel remote wireless cap. The results demonstrate that our proposed model outperforms six widely used competitive EEG models, achieving a high accuracy of 95.65%. Finally, we investigate the critical brain regions and connections for driving fatigue detection through the dynamically learned adjacency matrix.
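To make the pipeline concrete, here is a minimal PyTorch sketch of an AMCNN-DGCN-style model: parallel temporal convolutions at several kernel lengths stand in for the multiscale frequency filters, a sigmoid gate stands in for the attention module, and a trainable adjacency matrix drives one graph-convolution step. The class name AMCNNDGCNSketch, all layer sizes, and the kernel lengths in scales are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AMCNNDGCNSketch(nn.Module):
    """Illustrative stand-in for the AMCNN-DGCN pipeline (not the paper's exact model)."""

    def __init__(self, n_channels=24, n_classes=2, scales=(16, 32, 64), n_filters=8):
        super().__init__()
        # Multiscale temporal convolutions: each branch learns frequency
        # filters at a different kernel length, i.e. a different time scale.
        self.branches = nn.ModuleList([
            nn.Conv2d(1, n_filters, kernel_size=(1, k), padding=(0, k // 2))
            for k in scales
        ])
        feat = n_filters * len(scales)
        # Attention gate over the concatenated multiscale features
        # (squeeze-and-excitation-style gating is an assumption here).
        self.att = nn.Sequential(
            nn.Linear(feat, feat // 2), nn.ReLU(),
            nn.Linear(feat // 2, feat), nn.Sigmoid(),
        )
        # "Dynamical" graph convolution: the adjacency matrix is a trainable
        # parameter, so inter-channel relations are learned from data
        # rather than fixed by hand.
        self.adj = nn.Parameter(torch.rand(n_channels, n_channels))
        self.gcn = nn.Linear(feat, feat)
        self.classifier = nn.Linear(n_channels * feat, n_classes)

    def forward(self, x):                        # x: (batch, channels, samples)
        t = x.size(-1)
        x = x.unsqueeze(1)                       # (batch, 1, channels, samples)
        h = torch.cat([b(x)[..., :t] for b in self.branches], dim=1)
        h = h.mean(dim=-1).transpose(1, 2)       # pool over time -> (batch, channels, feat)
        h = h * self.att(h)                      # attention-weighted features
        a = F.softmax(F.relu(self.adj), dim=-1)  # normalized learned adjacency
        h = F.relu(a @ self.gcn(h))              # one graph-convolution step
        return self.classifier(h.flatten(1))


# Dummy usage: a batch of 4 trials, 24 EEG channels, 384 time samples.
model = AMCNNDGCNSketch()
print(model(torch.randn(4, 24, 384)).shape)      # torch.Size([4, 2])
```

The design point mirrored from the abstract is that self.adj is trained jointly with the rest of the network, so interchannel connectivity is inferred from data rather than fixed in advance; inspecting the learned matrix after training is what would support the paper's analysis of critical brain regions and connections.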

