Article

Facial Expression Recognition with Identity and Emotion Joint Learning

Journal

IEEE TRANSACTIONS ON AFFECTIVE COMPUTING
Volume 12, Issue 2, Pages 544-550

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TAFFC.2018.2880201

Keywords

Face recognition; Face; Task analysis; Feature extraction; Convolution; Emotion recognition; Training data; Facial expression recognition; emotion recognition; face recognition; joint learning; transfer learning

Funding

  1. National Natural Science Foundation of China [61773413]
  2. Natural Science Foundation of Guangzhou City [201707010363]
  3. Science and Technology Program of Guangzhou City
  4. Six talent peaks project in Jiangsu Province [JY-074]


Abstract

This study introduces a deep learning approach that combines identity features and emotion features to enhance facial expression recognition. Experimental results demonstrate high accuracy on multiple databases, outperforming baseline methods and state-of-the-art techniques.
Different subjects may express the same expression in different ways due to inter-subject variability. In this work, besides learning a deep facial expression (emotion) feature, we also consider the influence of latent face identity features, such as the shape or appearance of the face. We propose an identity and emotion joint learning approach with deep convolutional neural networks (CNNs) to enhance the performance of facial expression recognition (FER) tasks. First, we learn the emotion and identity features separately, using two different CNNs with their corresponding training data. Second, we concatenate these two features into a deep-learned Tandem Facial Expression (TFE) feature and feed it to subsequent fully connected layers to form a new model. Finally, we perform joint learning on the merged network using only the facial expression training data. Experimental results show that our approach achieves 99.31 and 84.29 percent accuracy on the CK+ and FER+ databases, respectively, outperforming the residual network baseline as well as many other state-of-the-art methods.
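The three stages described in the abstract (two separately trained feature extractors, concatenation into a TFE feature, then joint training on expression data only) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the two pretrained CNN backbones are replaced by fixed random projections, and only a new softmax layer is trained, whereas the paper fine-tunes the merged network end to end. All names (`emotion_features`, `identity_features`, `tandem_feature`, the dimensions) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the two pretrained CNN backbones (hypothetical):
# in the paper these are deep networks trained on expression data
# and face-identity data respectively; here each is a fixed random
# projection mapping a flattened face image to a 256-d feature.
W_emotion = rng.normal(size=(64, 256))   # emotion branch weights
W_identity = rng.normal(size=(64, 256))  # identity branch weights

def emotion_features(x):
    return np.tanh(x @ W_emotion)

def identity_features(x):
    return np.tanh(x @ W_identity)

def tandem_feature(x):
    # TFE feature: concatenate the two branch outputs
    # along the feature dimension.
    return np.concatenate([emotion_features(x), identity_features(x)], axis=1)

def train_classifier(X, y, n_classes=7, lr=0.1, epochs=200):
    # Joint-learning stage (sketch): fit a softmax layer on the TFE
    # feature using only expression-labelled data.
    n = X.shape[0]
    F = tandem_feature(X)                 # (n, 512)
    W = np.zeros((F.shape[1], n_classes))
    onehot = np.eye(n_classes)[y]
    for _ in range(epochs):
        logits = F @ W
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        W -= lr * F.T @ (p - onehot) / n  # cross-entropy gradient step
    return W

# Toy usage: 7 expression classes (as in CK+/FER+), random "images".
X = rng.normal(size=(70, 64))
y = rng.integers(0, 7, size=70)
W = train_classifier(X, y)
pred = np.argmax(tandem_feature(X) @ W, axis=1)
```

The key design point is that the identity branch stays frozen during the final stage, so identity information regularizes the expression classifier without requiring identity labels for the expression training set.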


Reviews

Primary rating

4.7 (insufficient ratings)
