Article

Reconstructing Recognizable 3D Face Shapes based on 3D Morphable Models

Journal

COMPUTER GRAPHICS FORUM
Volume 41, Issue 6, Pages 348-364

Publisher

WILEY
DOI: 10.1111/cgf.14513

Keywords

facial modelling; modelling


Abstract

This study reconstructs recognizable 3D face shapes and introduces a novel shape identity-aware regularization (SIR) loss. By improving discriminability in both the shape-parameter and shape-geometry domains, the method achieves superior experimental results compared to existing methods.
Many recent works have reconstructed distinctive 3D face shapes by aggregating shape parameters of the same identity and separating those of different people based on parametric models (e.g. 3D morphable models (3DMMs)). However, despite the high accuracy of face recognition using these shape parameters, the visual discrimination of face shapes reconstructed from those parameters remains unsatisfactory. Previous works have not answered the following research question: do discriminative shape parameters guarantee visual discrimination in the represented 3D face shapes? This paper analyses the relationship between shape parameters and reconstructed shape geometry, and proposes a novel shape identity-aware regularization (SIR) loss for shape parameters, aiming to increase discriminability in both the shape-parameter and shape-geometry domains. Moreover, to cope with the lack of training data containing both landmark and identity annotations, we propose a network structure and an associated training strategy that leverage mixed data containing either identity or landmark labels. In addition, since face recognition accuracy does not imply the recognizability of face shapes reconstructed from the shape parameters, we propose the SIR metric to measure the discriminability of face shapes. We compare our method with existing methods in terms of reconstruction error, visual discriminability, face recognition accuracy of the shape parameters, and the SIR metric. Experimental results show that our method outperforms state-of-the-art methods. The code will be released at .
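The paper does not give the SIR loss formula in this abstract, but the core idea it describes (pulling shape parameters of the same identity together and pushing different identities apart, in both the parameter domain and the reconstructed-geometry domain of a linear 3DMM) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the function names, the hinge/margin form, and the `weight` balancing term are all assumptions.

```python
import numpy as np

def reconstruct(params, mean_shape, basis):
    # Linear 3DMM: vertices = mean shape + shape basis @ parameters
    # basis has shape (3N, K) for N vertices and K shape coefficients.
    return mean_shape + basis @ params

def sir_loss(params, labels, mean_shape, basis, margin=1.0, weight=0.5):
    """Hypothetical SIR-style loss (illustrative only).

    For every sample pair, same-identity distances are minimized and
    different-identity distances are pushed beyond a margin, measured in
    BOTH the shape-parameter domain and the reconstructed-geometry domain.
    """
    geoms = [reconstruct(p, mean_shape, basis) for p in params]
    total = 0.0
    n = len(params)
    for i in range(n):
        for j in range(i + 1, n):
            d_param = np.linalg.norm(params[i] - params[j])
            d_geom = np.linalg.norm(geoms[i] - geoms[j])
            if labels[i] == labels[j]:
                # Intra-identity: shrink distances in both domains.
                total += d_param ** 2 + weight * d_geom ** 2
            else:
                # Inter-identity: hinge pushes distances past the margin.
                total += max(0.0, margin - d_param) ** 2
                total += weight * max(0.0, margin - d_geom) ** 2
    return total
```

A geometry-domain term is what distinguishes this from regularizing shape parameters alone: two parameter vectors can be far apart while the 3DMM basis maps them to nearly identical meshes, which is exactly the gap between parameter discriminability and visual discriminability that the paper's research question targets.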
