Article

Inter-Brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging During Video Watching

Journal

IEEE TRANSACTIONS ON AFFECTIVE COMPUTING
Volume 12, Issue 1, Pages 92-102

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TAFFC.2018.2849758

Keywords

Emotion; inter-brain; EEG; implicit tagging

Funding

  1. National Key Research and Development Plan [2016YFB1001200]
  2. National Natural Science Foundation of China [U1736220, 61725204]
  3. National Social Science Foundation of China [17ZDA323]
  4. MOE (Ministry of Education, China) Project of Humanities and Social Sciences [17YJA190017]
  5. Tsinghua University Initiative Scientific Research Program [2014z21043]

Abstract

How to efficiently tag the emotional experience of multimedia content is an important and challenging problem in the field of affective computing. This paper presents an EEG-based real-time emotion tagging approach that extracts inter-brain features from a group of participants while they watch the same emotional video clips. First, continuous subjective reports on both the arousal and valence dimensions of emotion were obtained using a three-round behavioral rating paradigm. Second, inter-brain features were systematically explored in both the spectral and temporal domains. Finally, regression analyses were performed to evaluate the effectiveness of the inter-brain amplitude and phase features. The inter-brain amplitude feature showed significantly better prediction performance than the inter-brain phase feature, as well as two other conventional features (spectral power and inter-subject correlation). By combining the four types of features, regression values (R²) of 0.61 ± 0.01 for arousal and 0.70 ± 0.01 for valence were obtained, corresponding to prediction errors of 1.01 ± 0.02 and 0.78 ± 0.02 (on 9-point scales), respectively. The contributions of different electrodes and frequency bands were also analyzed. Our results show the promising potential of inter-brain EEG features for real-time emotion tagging applications.
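
The abstract describes the pipeline only at a high level. As a rough illustration of what inter-brain amplitude and phase features can look like, the sketch below band-pass filters each participant's EEG, takes the Hilbert envelope and phase, averages pairwise envelope correlations and phase-locking values across participants for each channel and time window, and then regresses a continuous rating trace on those features with ridge regression. This is a minimal sketch under assumed inputs (an `eeg` array of shape subjects × channels × samples, a sampling rate `fs`, and a windowed `ratings` vector), not the authors' actual implementation; the frequency band, window length, and regressor are placeholders.

```python
"""Illustrative sketch of inter-brain amplitude/phase feature extraction
and regression against continuous ratings (not the paper's released code)."""
from itertools import combinations

import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score


def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter along the last (time) axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)


def inter_brain_features(eeg, fs, band=(8, 13), win_sec=1.0):
    """Per-window inter-brain features for each channel:
    mean pairwise envelope correlation (amplitude) and
    mean pairwise phase-locking value (phase), averaged over subject pairs."""
    n_sub, n_ch, n_smp = eeg.shape
    analytic = hilbert(bandpass(eeg, band[0], band[1], fs), axis=-1)
    amp, phase = np.abs(analytic), np.angle(analytic)
    win = int(win_sec * fs)
    n_win = n_smp // win
    pairs = list(combinations(range(n_sub), 2))
    amp_feat = np.zeros((n_win, n_ch))
    plv_feat = np.zeros((n_win, n_ch))
    for w in range(n_win):
        sl = slice(w * win, (w + 1) * win)
        for ch in range(n_ch):
            a, p = amp[:, ch, sl], phase[:, ch, sl]
            amp_feat[w, ch] = np.mean(
                [np.corrcoef(a[i], a[j])[0, 1] for i, j in pairs])
            plv_feat[w, ch] = np.mean(
                [np.abs(np.mean(np.exp(1j * (p[i] - p[j])))) for i, j in pairs])
    return np.hstack([amp_feat, plv_feat])  # shape: (n_win, 2 * n_ch)


if __name__ == "__main__":
    # Synthetic demo: 4 "subjects", 2 channels, 30 s at 128 Hz, placeholder ratings.
    rng = np.random.default_rng(0)
    fs = 128
    eeg = rng.standard_normal((4, 2, 30 * fs))
    X = inter_brain_features(eeg, fs)              # one feature row per 1 s window
    ratings = rng.uniform(1, 9, size=X.shape[0])   # placeholder 9-point rating trace
    r2 = cross_val_score(Ridge(alpha=1.0), X, ratings, cv=5, scoring="r2")
    print("cross-validated R^2:", r2.mean())
```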

