Article

Sample hardness guided softmax loss for face recognition

Journal

APPLIED INTELLIGENCE
Volume 53, Issue 3, Pages 2640-2655

Publisher

SPRINGER
DOI: 10.1007/s10489-022-03504-5

Keywords

Face recognition; Feature extraction; Data mining; Machine learning; Cost function


This paper proposes a novel loss function, called Hardness Loss, which adaptively assigns weights to misclassified (hard) samples by drawing on multiple pieces of training-status and feature-position information. Experimental results show that the proposed method outperforms state-of-the-art approaches in a range of face recognition scenarios.
Face recognition (FR) has received remarkable attention for improving feature discrimination with the development of deep convolutional neural networks (CNNs). Although existing methods have achieved great success in designing margin-based loss functions with hard-sample mining strategies, they still suffer from two issues: (1) the neglect of some training-status and feature-position information, and (2) inaccurate weight assignment for hard samples due to a coarse hardness description. To solve these issues, we develop a novel loss function, namely Hardness Loss, that adaptively assigns weights to misclassified (hard) samples guided by their corresponding hardness, which accounts for multiple pieces of training-status and feature-position information. Specifically, we propose an estimator that provides the real-time training status used to precisely compute the hardness for weight assignment. To the best of our knowledge, this is the first attempt to design a loss function using multiple pieces of information about the training status and feature positions. Extensive experiments on popular face benchmarks demonstrate that the proposed method is superior to state-of-the-art (SOTA) losses under various FR scenarios.
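For readers unfamiliar with this family of losses, the sketch below shows roughly where a hardness-guided weighting sits in a margin-based softmax pipeline. It is not the authors' formulation, which is not reproduced in the abstract: the class name HardnessGuidedSoftmax, the hyper-parameters s, m, and alpha, the moving-average status estimator, and the CurricularFace-style modulation of hard logits are all placeholders assumed for illustration.

```python
# Illustrative sketch only; the paper's exact Hardness Loss is not shown here.
# It follows the general margin-based softmax family: an ArcFace-style margin
# on the target logit plus a re-weighting of misclassified (hard) non-target
# logits driven by a running training-status estimate (all names assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F


class HardnessGuidedSoftmax(nn.Module):
    def __init__(self, feat_dim: int, num_classes: int,
                 s: float = 64.0, m: float = 0.5, alpha: float = 0.99):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, feat_dim))
        nn.init.xavier_uniform_(self.weight)
        self.s, self.m, self.alpha = s, m, alpha
        # Running mean of the target cosine: a simple stand-in for the
        # "real-time training status" estimator mentioned in the abstract.
        self.register_buffer("status", torch.zeros(1))

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized features and class centers.
        cos = F.linear(F.normalize(feats), F.normalize(self.weight))
        cos = cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7)
        target_cos = cos.gather(1, labels.view(-1, 1))

        # Update the training-status estimate (exponential moving average).
        with torch.no_grad():
            self.status = self.alpha * self.status + (1 - self.alpha) * target_cos.mean()

        # ArcFace-style additive angular margin on the target logit.
        target_margin = torch.cos(torch.acos(target_cos) + self.m)

        # A non-target logit is "hard" if it exceeds the margined target logit;
        # hard logits are enlarged according to the current training status
        # (CurricularFace-style modulation, used here only as a placeholder
        # for the paper's hardness-guided weight assignment).
        hard_mask = cos > target_margin
        cos_hard = cos * (self.status + cos)
        cos_out = torch.where(hard_mask, cos_hard, cos)

        # Put the margined logit back in the target column and scale.
        one_hot = F.one_hot(labels, num_classes=cos.size(1)).bool()
        logits = self.s * torch.where(one_hot, target_margin, cos_out)
        return F.cross_entropy(logits, labels)
```

In use, feats would be the embedding produced by a CNN backbone (e.g. a 512-d vector per face image), and a module like this would replace the final fully connected layer plus softmax cross-entropy during training; at test time only the backbone embeddings are compared.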

