Article

Multimodal Face-Pose Estimation With Multitask Manifold Deep Learning

Journal

IEEE Transactions on Industrial Informatics
Volume 15, Issue 7, Pages 3952-3961

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TII.2018.2884211

Keywords

Convolutional neural networks (CNNs); face-pose estimation; low-rank learning; multitask learning

Funding

  1. National Natural Science Foundation of China [61622205, 61836002]
  2. Zhejiang Provincial Natural Science Foundation of China [LY17F020009]
  3. Fujian Provincial Natural Science Foundation of China [2018J01573]
  4. Fujian Provincial High School Natural Science Foundation of China [JZ160472]
  5. Foundation of Fujian Educational Committee [JAT160357]

Abstract

Face-pose estimation aims at estimating a person's gaze direction from two-dimensional face images. It provides important communicative information and cues for visual saliency. However, it is challenging because of variations in lighting, background, face orientation, and appearance visibility. A descriptive representation of face images, and a mapping from that representation to poses, are therefore critical. In this paper, we use multimodal data and propose a novel face-pose estimation framework named multitask manifold deep learning (M²DL). It is based on feature extraction with improved convolutional neural networks (CNNs) and a multimodal mapping relationship learned with multitask learning. In the proposed CNNs, manifold regularized convolutional layers learn the relationships between the outputs of neurons in a low-rank space. In addition, in the proposed mapping relationship learning method, different modalities of face representation are naturally combined by applying multitask learning with incoherent sparse and low-rank learning under a least-squares loss. Experimental results on three challenging benchmark datasets demonstrate the effectiveness of M²DL.
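The two mechanisms named in the abstract can be made concrete with a small sketch. The NumPy code below is illustrative only: it assumes a graph-Laplacian form of the manifold regularizer and a least-squares multitask objective in which the mapping matrix decomposes as W = P + Q, with a nuclear-norm penalty on the low-rank part P and an l1 penalty on the sparse part Q, solved by proximal gradient descent. The function names, penalty weights, and solver are assumptions for illustration, not the authors' exact formulation.

    import numpy as np

    def manifold_penalty(F, A):
        """Graph-Laplacian manifold regularizer on layer outputs.

        F : (n, d) matrix of neuron outputs for n samples.
        A : (n, n) symmetric affinity matrix between samples.
        Returns tr(F^T L F) = 0.5 * sum_ij A_ij * ||F_i - F_j||^2,
        which is small when similar samples get similar representations.
        """
        L = np.diag(A.sum(axis=1)) - A  # unnormalized graph Laplacian
        return np.trace(F.T @ L @ F)

    def svt(M, tau):
        """Singular-value thresholding: prox of tau * nuclear norm."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return (U * np.maximum(s - tau, 0.0)) @ Vt

    def soft(M, tau):
        """Elementwise soft thresholding: prox of tau * l1 norm."""
        return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

    def fit_sparse_lowrank(X, Y, lam_lr=1.0, lam_sp=0.1, n_iter=500):
        """Least-squares multitask regression with W = P + Q, where
        P is low rank (structure shared across tasks/modalities) and
        Q is sparse (task-specific deviations):

            min_{P,Q} 0.5*||X(P + Q) - Y||_F^2
                      + lam_lr*||P||_* + lam_sp*||Q||_1

        solved by proximal gradient descent.
        """
        d, t = X.shape[1], Y.shape[1]
        P, Q = np.zeros((d, t)), np.zeros((d, t))
        # Lipschitz constant of the joint gradient in (P, Q) is 2*||X||_2^2.
        step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2 + 1e-12)
        for _ in range(n_iter):
            G = X.T @ (X @ (P + Q) - Y)   # gradient of the smooth term
            P = svt(P - step * G, step * lam_lr)   # prox step on P
            Q = soft(Q - step * G, step * lam_sp)  # prox step on Q
        return P, Q

    # Illustrative usage with synthetic data: rows of X stand in for
    # concatenated CNN features (one block per modality); columns of Y
    # are pose targets (e.g., yaw/pitch/roll), one task per column.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 64))
    Y = rng.standard_normal((200, 3))
    P, Q = fit_sparse_lowrank(X, Y)
    print(np.linalg.matrix_rank(P, tol=1e-3), np.count_nonzero(np.abs(Q) > 1e-6))

Under this reading, the low-rank part captures structure shared across modalities and tasks, while the sparse part absorbs task-specific deviations; the incoherence between the two penalties is what lets them separate cleanly.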
