Article

Facial Expression Recognition with Identity and Emotion Joint Learning

Journal

IEEE TRANSACTIONS ON AFFECTIVE COMPUTING
Volume 12, Issue 2, Pages 544-550

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TAFFC.2018.2880201

Keywords

Face recognition; Face; Task analysis; Feature extraction; Convolution; Emotion recognition; Training data; Facial expression recognition; joint learning; transfer learning

Funding

  1. National Natural Science Foundation of China [61773413]
  2. Natural Science Foundation of Guangzhou City [201707010363]
  3. Science and Technology Program of Guangzhou City
  4. Six talent peaks project in Jiangsu Province [JY-074]

Abstract
Different subjects may express a specific expression in different ways due to inter-subject variability. In this work, besides training a deep-learned facial expression (emotional) feature, we also consider the influence of latent face identity features, such as face shape or appearance. We propose an identity and emotion joint learning approach with deep convolutional neural networks (CNNs) to enhance the performance of facial expression recognition (FER) tasks. First, we learn the emotion and identity features separately, using two different CNNs with their corresponding training data. Second, we concatenate these two features into a deep-learned Tandem Facial Expression (TFE) feature and feed it to the subsequent fully connected layers to form a new model. Finally, we perform joint learning on the newly merged network using only the facial expression training data. Experimental results show that our proposed approach achieves 99.31 and 84.29 percent accuracy on the CK+ and FER+ databases, respectively, outperforming the residual network baseline as well as many other state-of-the-art methods.
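The core mechanism the abstract describes, concatenating the emotion-branch and identity-branch features into one TFE vector and classifying it with fully connected layers, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, the single-layer classifier, and the random stand-in features are all assumptions (the paper's actual branches are deep CNNs, and FER+ uses 8 expression classes).

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions for illustration; the paper does not fix them here.
EMOTION_DIM, IDENTITY_DIM, NUM_CLASSES = 128, 128, 8  # FER+ labels 8 expressions

def tandem_feature(emotion_feat, identity_feat):
    """Concatenate the two deep-learned features into one TFE vector."""
    return np.concatenate([emotion_feat, identity_feat], axis=-1)

def fc_softmax(tfe, weights, bias):
    """One fully connected layer with softmax over expression classes
    (stand-in for the paper's subsequent fully connected layers)."""
    logits = tfe @ weights + bias
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

# Stand-in outputs of the two CNN branches for a batch of 4 faces.
emotion_feat = rng.standard_normal((4, EMOTION_DIM))
identity_feat = rng.standard_normal((4, IDENTITY_DIM))

tfe = tandem_feature(emotion_feat, identity_feat)  # shape (4, 256)
W = rng.standard_normal((EMOTION_DIM + IDENTITY_DIM, NUM_CLASSES)) * 0.01
b = np.zeros(NUM_CLASSES)
probs = fc_softmax(tfe, W, b)                      # one probability row per face
```

In the paper's joint-learning step, only the merged network after concatenation is fine-tuned on expression data; here that would correspond to updating `W` and `b` (and the branch weights) while keeping the identity branch's pretraining as initialization.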

