Article

Investigating EEG-based functional connectivity patterns for multimodal emotion recognition

Journal

JOURNAL OF NEURAL ENGINEERING
Volume 19, Issue 1

Publisher

IOP Publishing Ltd
DOI: 10.1088/1741-2552/ac49a7

Keywords

affective brain-computer interface; EEG; eye movement; brain functional connectivity network; multimodal emotion recognition; multimodal deep learning

Funding

  1. National Natural Science Foundation of China [61976135]
  2. SJTU Trans-Med Awards Research [WF540162605]
  3. Fundamental Research Funds for the Central Universities
  4. 111 Project
  5. GuangCi Professorship Program of RuiJin Hospital Shanghai Jiao Tong University School of Medicine

Abstract

This study proposes a novel algorithm for selecting emotion-relevant critical subnetworks and investigates three EEG functional connectivity network features, which achieve high classification accuracy in emotion recognition.

Objective. Previous studies on emotion recognition from electroencephalography (EEG) mainly rely on single-channel feature extraction methods, which ignore the functional connectivity between brain regions. In this paper, we therefore propose a novel emotion-relevant critical subnetwork selection algorithm and investigate three EEG functional connectivity network features: strength, clustering coefficient, and eigenvector centrality.

Approach. After constructing brain networks from the correlations between pairs of EEG signals, we computed the critical subnetworks by averaging the brain network matrices that share the same emotion label and discarding weak associations. The three network features were then fed, together with eye movement features, into a multimodal emotion recognition model based on deep canonical correlation analysis. The discriminative power of the EEG connectivity features was evaluated on three public datasets: SEED, SEED-V, and DEAP.

Main results. The experimental results reveal that the strength feature outperforms state-of-the-art features based on single-channel analysis. The multimodal classification accuracies are 95.08 +/- 6.42% on the SEED dataset, 84.51 +/- 5.11% on the SEED-V dataset, and 85.34 +/- 2.90% and 86.61 +/- 3.76% for arousal and valence, respectively, on the DEAP dataset, all achieving the best performance. In addition, brain networks constructed with only 18 channels achieve performance comparable to that of the 62-channel networks and enable easier setups in real-world scenarios.

Significance. The proposed combination of EEG functional connectivity networks with the emotion-relevant critical subnetwork selection algorithm is a successful exploration of the information carried between channels.
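To make the Approach concrete, below is a minimal sketch (not the authors' released code) of how connectivity construction, critical subnetwork selection, and the three network features might be implemented. It assumes Pearson correlation as the connectivity measure, a simple proportional threshold (`keep_ratio`) for eliminating weak associations, and NetworkX for the graph metrics; every function name and parameter here is an illustrative assumption, not the paper's exact procedure.

```python
import numpy as np
import networkx as nx

def connectivity_matrix(eeg_trial):
    """Absolute Pearson correlation between every pair of channels.

    eeg_trial: array of shape (n_channels, n_samples).
    Returns a symmetric (n_channels, n_channels) matrix.
    """
    return np.abs(np.corrcoef(eeg_trial))

def critical_subnetwork(matrices, keep_ratio=0.2):
    """Average the connectivity matrices of all trials sharing one
    emotion label, then keep only the strongest edges (an assumed
    proportional threshold) to eliminate weak associations."""
    avg = np.mean(matrices, axis=0)
    np.fill_diagonal(avg, 0.0)
    # Keep the top keep_ratio fraction of edge weights as a binary mask.
    upper = avg[np.triu_indices_from(avg, k=1)]
    cutoff = np.quantile(upper, 1.0 - keep_ratio)
    mask = (avg >= cutoff).astype(float)
    return np.maximum(mask, mask.T)  # enforce symmetry

def network_features(conn, mask):
    """Extract the three topological features named in the abstract
    (strength, clustering coefficient, eigenvector centrality)
    from the masked, emotion-relevant connectivity graph."""
    g = nx.from_numpy_array(conn * mask)
    strength = np.array([d for _, d in g.degree(weight="weight")])
    clustering = np.array(list(nx.clustering(g, weight="weight").values()))
    eigcent = np.array(list(
        nx.eigenvector_centrality_numpy(g, weight="weight").values()))
    return np.concatenate([strength, clustering, eigcent])
```

Under this reading, the mask would be learned once per emotion label from training trials and then applied to each trial's connectivity matrix before feature extraction; the resulting EEG feature vectors and the eye movement features would subsequently be fused by the deep canonical correlation analysis model described in the paper.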

