Article

SincNet-Based Hybrid Neural Network for Motor Imagery EEG Decoding

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/TNSRE.2022.3156076

Keywords

Electroencephalography; Feature extraction; Convolution; Kernel; Convolutional neural networks; Filter banks; Task analysis; Brain-computer interface; motor imagery; SincNet; neural network; gated recurrent unit

Funding

  1. National Key Research and Development Program [2017YFB13003002]
  2. National Natural Science Foundation of China [61573142, 61773164]
  3. Program of Introducing Talents of Discipline to Universities (the 111 Project) [B17017]
  4. Shuguang Project - Shanghai Municipal Education Commission
  5. Shanghai Education Development Foundation [19SG25]
  6. Ministry of Education and Science of the Russian Federation [14.756.31.0001]
  7. Polish National Science Center [UMO-2016/20/W/NZ4/00354]


This study proposes a hybrid neural network, called SHNN, for motor imagery-based brain-computer interfaces. The SHNN uses SincNet layers as learnable band-pass filters for the EEG data and combines squeeze-and-excitation modules, convolutional neural networks, and a gated recurrent unit module for deep feature representation and classification. The results show that the SHNN outperforms other state-of-the-art methods on the BCI competition IV datasets.
It is difficult to identify optimal cut-off frequencies for the filters used with the common spatial pattern (CSP) method in motor imagery (MI)-based brain-computer interfaces (BCIs). Most current studies choose filter cut-off frequencies based on experience or intuition, resulting in sub-optimal use of MI-related spectral information in the electroencephalography (EEG). To improve information utilization, we propose a SincNet-based hybrid neural network (SHNN) for MI-based BCIs. First, raw EEG is segmented into different time windows and mapped into the CSP feature space. Then, SincNets are used as filter-bank band-pass filters to automatically filter the data. Next, squeeze-and-excitation modules learn a sparse representation of the filtered data. The resulting sparse data are fed into convolutional neural networks to learn deep feature representations. Finally, these deep features are fed into a gated recurrent unit module to capture sequential relations, and a fully connected layer is used for classification. We used the BCI competition IV datasets 2a and 2b to verify the effectiveness of our SHNN method. The mean classification accuracies (kappa values) of our SHNN method are 0.7426 (0.6648) on dataset 2a and 0.8349 (0.6697) on dataset 2b, respectively. The statistical test results demonstrate that our SHNN significantly outperforms other state-of-the-art methods on these datasets.
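The key idea behind the SincNet layer is that each band-pass filter is parametrized only by its two cut-off frequencies, which are learned from data instead of being fixed by hand; the filter kernel itself is the difference of two windowed sinc low-pass filters. The sketch below (a minimal NumPy illustration, not the authors' implementation; the 250 Hz sampling rate matches BCI competition IV dataset 2a, and the 8–30 Hz band is a common mu/beta choice) shows how such a kernel is constructed and how its frequency response can be inspected:

```python
import numpy as np

def sinc_bandpass_kernel(f_low, f_high, kernel_size, fs):
    """Band-pass FIR kernel parametrized only by its two cut-off
    frequencies, as in SincNet: the difference of two windowed
    sinc low-pass filters. In the actual network, f_low and f_high
    would be trainable parameters."""
    n = np.arange(kernel_size) - (kernel_size - 1) / 2

    def lowpass(fc):
        # ideal low-pass impulse response at normalized cut-off fc/fs
        return 2 * (fc / fs) * np.sinc(2 * (fc / fs) * n)

    kernel = lowpass(f_high) - lowpass(f_low)
    kernel *= np.hamming(kernel_size)  # smooth the band edges
    return kernel

fs = 250.0  # sampling rate of BCI competition IV dataset 2a
kernel = sinc_bandpass_kernel(8.0, 30.0, kernel_size=101, fs=fs)

# inspect the magnitude response: gain inside vs. outside the band
freqs = np.fft.rfftfreq(1024, d=1 / fs)
mag = np.abs(np.fft.rfft(kernel, 1024))
passband_gain = mag[(freqs > 10) & (freqs < 28)].mean()
stopband_gain = mag[freqs > 60].mean()
```

During training, gradients flow through `f_low` and `f_high`, so the network can shift each filter's band toward the subject-specific MI-related rhythms rather than relying on fixed cut-offs.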

