Article

Fusion Graph Representation of EEG for Emotion Recognition

Journal

SENSORS
Volume 23, Issue 3, Pages: -

Publisher

MDPI
DOI: 10.3390/s23031404

Keywords

emotion recognition; EEG; graph convolutional network; feature fusion


This paper proposes a fusion graph convolutional network (FGCN) to extract the various relations present in electroencephalogram (EEG) data for emotion recognition. By mining brain connection features and fusing different relation graphs, the accuracy of emotion recognition can be significantly improved.
The various relations present in electroencephalogram (EEG) data are significant for EEG feature representation, so graph-based studies focus on extracting the relevancy between EEG channels. The shortcoming of existing graph studies is that they consider only a single relationship among EEG electrodes, which results in an incomplete representation of EEG data and relatively low emotion recognition accuracy. In this paper, we propose a fusion graph convolutional network (FGCN) that extracts the various relations present in EEG data and fuses them to represent EEG data more comprehensively for emotion recognition. First, the FGCN mines brain connection features on topology, causality, and function. Then, a local fusion strategy fuses these three graphs to fully exploit the channels with strong topological, causal, and functional relations. Finally, a graph convolutional neural network is adopted to better represent EEG data for emotion recognition. Experiments on SEED and SEED-IV demonstrate that fusing different relation graphs is effective for improving emotion recognition, and the 3-class and 4-class recognition accuracies are higher than those of other state-of-the-art methods.
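The abstract describes fusing three relation graphs (topological, causal, functional) before applying a graph convolution. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not the authors' implementation: the class name LocalFusionGCN, the channel-wise fusion weights, and the symmetric-normalisation step are assumptions made for illustration only.

```python
# Hypothetical sketch: fuse three EEG relation graphs, then apply one GCN layer.
import torch
import torch.nn as nn


class LocalFusionGCN(nn.Module):
    """Fuse topological, causal, and functional EEG graphs, then run a GCN layer."""

    def __init__(self, num_channels: int, in_dim: int, out_dim: int):
        super().__init__()
        # Learnable per-channel fusion weights for the three relation graphs
        # (assumption: "local fusion" is approximated here as a channel-wise
        # weighted sum of the three adjacency matrices).
        self.fusion_weights = nn.Parameter(torch.ones(3, num_channels))
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_topo, a_causal, a_func):
        # x: node features, shape (batch, C, in_dim); a_*: adjacency, shape (C, C)
        w = torch.softmax(self.fusion_weights, dim=0)      # (3, C)
        graphs = torch.stack([a_topo, a_causal, a_func])   # (3, C, C)
        a_fused = (w.unsqueeze(-1) * graphs).sum(dim=0)    # (C, C)
        # Symmetric normalisation: D^{-1/2} (A + I) D^{-1/2}
        a_hat = a_fused + torch.eye(a_fused.size(0))
        deg = a_hat.sum(dim=-1)
        d_inv_sqrt = torch.diag(deg.clamp(min=1e-6).pow(-0.5))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
        # Graph convolution: project node features, then aggregate neighbours
        return torch.relu(a_norm @ self.linear(x))


# Example usage with random data: 62 channels (as in SEED) and 5 features per channel.
model = LocalFusionGCN(num_channels=62, in_dim=5, out_dim=32)
x = torch.randn(8, 62, 5)
a = [torch.rand(62, 62) for _ in range(3)]
out = model(x, *a)  # shape (8, 62, 32)
```

In this sketch the fused adjacency is a convex combination of the three graphs per channel; the paper's actual local fusion strategy may select or weight channels differently.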
