Proceedings Paper

Multimodal Spontaneous Emotion Corpus for Human Behavior Analysis

Publisher

IEEE
DOI: 10.1109/CVPR.2016.374

Keywords

-

Funding

  1. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computer and Network Systems [CNS-1205664, CNS-1205195]

Abstract

Emotion is expressed in multiple modalities, yet most research has considered at most one or two. This stems in part from the lack of large, diverse, well-annotated, multimodal databases with which to develop and test algorithms. We present a well-annotated, multimodal, multidimensional spontaneous emotion corpus of 140 participants. Emotion inductions were highly varied. Data were acquired from a variety of facial sensors, including high-resolution 3D dynamic imaging, high-resolution 2D video, and thermal (infrared) sensing, as well as contact physiological sensors measuring electrical conductivity of the skin, respiration, blood pressure, and heart rate. Experts in the Facial Action Coding System (FACS) annotated facial expression from the 2D video for both the occurrence and intensity of facial action units. The corpus further includes derived features from the 3D, 2D, and infrared sensors, along with baseline results for facial expression and action unit detection. The entire corpus will be made available to the research community.
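To make the corpus structure concrete, here is a minimal sketch of a container for one synchronized recording. The paper does not specify a file layout or API, so `MultimodalSample`, its field names, and `au_base_rate` are hypothetical illustrations of the modalities and FACS labels described above, not the released corpus's actual interface.

```python
# Hypothetical sketch: one synchronized sample from a multimodal emotion
# corpus of the kind described in the abstract. All names are illustrative.
from dataclasses import dataclass
from typing import Dict

import numpy as np


@dataclass
class MultimodalSample:
    """One emotion-induction recording with synchronized modalities."""
    video_2d: np.ndarray            # (T, H, W, 3) high-resolution 2D frames
    mesh_3d: np.ndarray             # (T, V, 3) dynamic 3D face-mesh vertices
    thermal: np.ndarray             # (T, H_ir, W_ir) infrared frames
    physio: Dict[str, np.ndarray]   # e.g. "eda", "respiration", "blood_pressure", "heart_rate"
    au_occurrence: Dict[int, np.ndarray]  # AU number -> (T,) binary presence per frame
    au_intensity: Dict[int, np.ndarray]   # AU number -> (T,) FACS intensity (A-E coded 1-5)


def au_base_rate(sample: MultimodalSample, au: int) -> float:
    """Fraction of frames on which a FACS-coded action unit is present,
    a common statistic reported alongside AU-detection baselines."""
    return float(sample.au_occurrence[au].mean())
```

Keeping per-frame occurrence and intensity as separate per-AU arrays mirrors the two annotation tasks the abstract names (occurrence detection and intensity estimation) and lets either be loaded without the other.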
