Article

A Brain Network Analysis-Based Double Way Deep Neural Network for Emotion Recognition

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TNSRE.2023.3236434

Keywords

Electroencephalography; Emotion recognition; Electrodes; Brain modeling; Motion pictures; Feature extraction; Task analysis; Brain network; deep residual neural network; electroencephalogram (EEG); emotion recognition; Spearman correlation coefficient


This article proposes a double-way deep residual neural network combined with brain network analysis for classifying multiple emotional states. Emotional EEG signals are decomposed into frequency bands and brain networks are constructed; the features extracted by the two pathways are concatenated for classification, and the proposed model achieves high accuracy on emotion recognition tasks.

Constructing reliable and effective models to recognize human emotional states has become an important issue in recent years. In this article, we propose a double-way deep residual neural network combined with brain network analysis, which enables the classification of multiple emotional states. To begin with, we transform the emotional EEG signals into five frequency bands by wavelet transform and construct brain networks from inter-channel correlation coefficients. These brain networks are then fed into a deep neural network block that contains several residual-connection modules enhanced by channel and spatial attention mechanisms. In the second pathway of the model, we feed the emotional EEG signals directly into another deep neural network block to extract temporal features. At the end of the two pathways, the features are concatenated for classification. To verify the effectiveness of our proposed model, we carried out a series of experiments to collect emotional EEG from eight subjects. The average accuracy of the proposed model on our emotional dataset is 94.57%. In addition, the evaluation results on the public databases SEED and SEED-IV are 94.55% and 78.91%, respectively, demonstrating the superiority of our model in emotion recognition tasks.
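The first pathway's preprocessing (band decomposition, then inter-channel correlation networks) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it uses a Butterworth band-pass filter as a stand-in for the paper's wavelet transform, SciPy's `spearmanr` for the inter-channel correlation (Spearman appears in the paper's keywords), and assumed band cut-offs, sampling rate, and channel count.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from scipy.stats import spearmanr

# Canonical EEG bands in Hz (assumed; the abstract does not list the cut-offs).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def bandpass(eeg, lo, hi, fs):
    """Band-pass each channel; a stand-in for the paper's wavelet transform."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=1)

def brain_network(eeg):
    """Adjacency matrix of inter-channel Spearman correlations.

    eeg: (n_channels, n_samples) array for one frequency band.
    """
    rho, _ = spearmanr(eeg, axis=1)  # axis=1: each row is one channel
    np.fill_diagonal(rho, 1.0)       # self-correlation on the diagonal
    return rho

fs = 128                                   # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 4 * fs))     # 8 channels, 4 s of toy data

# One adjacency matrix per band, stacked into a (bands, channels, channels)
# tensor that could then be fed to the network pathway described above.
nets = np.stack([brain_network(bandpass(eeg, lo, hi, fs))
                 for lo, hi in BANDS.values()])
print(nets.shape)  # (5, 8, 8)
```

Each band's adjacency matrix is symmetric with unit diagonal, so the stacked tensor behaves like a multi-channel image, which is what makes a convolutional residual block with channel and spatial attention a natural consumer of it.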

