4.6 Article

Intensity enhancement via GAN for multimodal face expression recognition

Journal

NEUROCOMPUTING
Volume 454, Pages 124-134

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.05.022

Keywords

Face expression recognition; Intensity enhancement; Generative Adversarial Network

Funding

  1. National Natural Science Foundation of China [U20B2069, 61673033]
  2. China Postdoctoral Science Foundation [2019M660406]


This paper proposes a novel GAN-based multimodal approach that jointly models intensity enhancement and expression recognition, resulting in improved FER performance. Experimental results on multiple datasets validate the effectiveness of the method in both low-expression-intensity and general FER scenarios.
Face expression recognition (FER) under low expression intensity is not well studied in the literature. This paper investigates this problem and presents a novel Generative Adversarial Network (GAN)-based multimodal approach to it. The method models the tasks of intensity enhancement and expression recognition jointly, ensuring that the synthesized faces not only present expressions of high intensity but also genuinely improve FER performance. The proposed model is flexible enough that faces can be represented in various formats, such as RGB images, depth maps, and 3D point clouds, so that the complementarity of texture and geometry cues can be further exploited. Extensive experiments are conducted on the BU-3DFE, BU-4DFE, Oulu-CASIA and CK+ datasets. State-of-the-art FER performance is achieved not only under low expression intensities but also in general FER scenarios, clearly validating the effectiveness of the proposed method. (c) 2021 Published by Elsevier B.V.
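To make the joint formulation concrete, the following is a minimal, hypothetical PyTorch sketch of how intensity enhancement and expression recognition might be trained together: a generator G synthesizes a high-intensity face from a low-intensity input, a discriminator D enforces realism, and an expression classifier C is optimized on the synthesized face so that enhancement directly serves recognition. All module architectures, names, and loss weights here are illustrative assumptions, not the authors' implementation; the multimodal (depth / point-cloud) branches described in the abstract are omitted for brevity.

# Hypothetical sketch of a joint enhancement-plus-recognition objective (PyTorch).
# Shapes, layer choices, and loss weights are assumptions for illustration only.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a low-intensity face to a high-intensity face (toy conv net)."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, channels, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores how realistic a (high-intensity) face looks."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )
    def forward(self, x):
        return self.net(x)

class ExpressionClassifier(nn.Module):
    """Predicts one of num_classes expressions from a face image."""
    def __init__(self, channels=3, num_classes=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_classes),
        )
    def forward(self, x):
        return self.net(x)

G, D, C = Generator(), Discriminator(), ExpressionClassifier()
bce, ce = nn.BCEWithLogitsLoss(), nn.CrossEntropyLoss()
opt_g = torch.optim.Adam(list(G.parameters()) + list(C.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

# One illustrative training step on a dummy batch:
# low-intensity inputs, high-intensity targets, expression labels.
low, high = torch.randn(4, 3, 64, 64), torch.randn(4, 3, 64, 64)
labels = torch.randint(0, 6, (4,))

# Discriminator step: real high-intensity faces vs. synthesized ones.
fake = G(low).detach()
loss_d = bce(D(high), torch.ones(4, 1)) + bce(D(fake), torch.zeros(4, 1))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

# Generator + classifier step: fool D, stay close to the target,
# and keep the synthesized face recognizable as the correct expression.
fake = G(low)
loss_adv = bce(D(fake), torch.ones(4, 1))       # realism of the enhanced face
loss_rec = nn.functional.l1_loss(fake, high)    # closeness to the high-intensity target
loss_cls = ce(C(fake), labels)                  # recognition on the synthesized face
loss_g = loss_adv + 10.0 * loss_rec + loss_cls  # weights are assumptions
opt_g.zero_grad(); loss_g.backward(); opt_g.step()

Coupling the classifier loss into the generator update is one plausible reading of "jointly models intensity enhancement and expression recognition": the generator is rewarded only for syntheses that actually help recognition, not merely for realistic-looking faces.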

