Journal
PATTERN RECOGNITION
Volume 96, Issue -, Pages -
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2019.106966
Keywords
Expression recognition; Feature sparseness; Deep metric learning; Fine tuning; Generalization capability
Funding
- Natural Science Foundation of China [61602315, 61672357]
- Science and Technology Innovation Commission of Shenzhen [JCYJ20170302153827712]
- Tencent Rhinoceros Birds-Scientific Research Foundation for Young Teachers of Shenzhen University
- Shenzhen University [2018063]
While weight sparseness-based regularization has been used to learn better deep features for image recognition problems, it introduces a large number of optimization variables and can easily converge to a local optimum. The L2-norm regularization proposed for face recognition reduces the impact of noisy information, but it also suppresses expression information during regularization. This paper proposes a feature sparseness-based regularization that learns deep features with better generalization capability. The regularization is integrated into the loss function and optimized within a deep metric learning framework. A toy example shows that a simple network with the proposed sparseness regularization outperforms one with the L2-norm regularization. Furthermore, the proposed approach achieves competitive performance on four publicly available datasets, i.e., FER2013, CK+, Oulu-CASIA and MMI. State-of-the-art cross-database performance further justifies the generalization capability of the proposed approach. (C) 2019 Elsevier Ltd. All rights reserved.
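The abstract does not give the exact formulation, but the general idea of adding a feature-sparseness penalty to a metric-learning loss can be sketched as follows. This is a minimal illustration, assuming an L1 penalty on the deep feature activations and a hypothetical weight `lam`; the paper's actual regularizer and framework may differ.

```python
import numpy as np

def sparseness_regularized_loss(base_loss, features, lam=0.01):
    """Add a feature-sparseness penalty to a base metric-learning loss.

    base_loss: scalar loss from the deep metric learning objective
    features:  (batch, dim) array of deep feature activations
    lam:       regularization weight (hypothetical value for illustration)
    """
    # Mean L1 norm of the features encourages sparse activations,
    # unlike weight sparseness, which penalizes the network parameters.
    penalty = np.abs(features).mean()
    return base_loss + lam * penalty

# Sparser features incur a smaller regularized loss for the same base loss.
dense = np.ones((4, 8))
sparse = np.zeros((4, 8))
sparse[:, 0] = 1.0
assert sparseness_regularized_loss(1.0, sparse) < sparseness_regularized_loss(1.0, dense)
```

The key design point suggested by the abstract is that the penalty acts on the learned features rather than on the weights, so the optimizer handles far fewer regularized variables than weight-sparseness approaches.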