Article

Learning discriminative features via weights-biased softmax loss

Journal

PATTERN RECOGNITION
Volume 107

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2020.107405

Keywords

Classification; Softmax; CNNs; Fully connected layer units; Classifier weights

Funding

  1. National Key R&D Program of China [2017YFB1002203]
  2. NSFC [61976201, 61860206004]
  3. Ningbo 2025 Key Project of Science and Technology Innovation [2018B10071]

Loss functions play a key role in training superior deep neural networks. In convolutional neural networks (CNNs), the popular cross-entropy loss with softmax does not explicitly guarantee minimization of intra-class variance or maximization of inter-class variance. Moreover, earlier studies offer no theoretical analysis or experiments that explicitly indicate how to choose the number of units in the fully connected (FC) layer. To help CNNs learn features faster and more discriminatively, this paper makes two contributions. First, we determine the minimum number of units in the FC layer through rigorous theoretical analysis and extensive experiments, which reduces CNNs' parameter memory and training time. Second, we propose a negative-focused weights-biased softmax (W-Softmax) loss that helps CNNs learn more discriminative features. The proposed W-Softmax loss not only theoretically formulates intra-class compactness and inter-class separability but also helps avoid overfitting by enlarging decision margins. Moreover, the size of the decision margins can be flexibly controlled by adjusting a hyperparameter alpha. Extensive experimental results on several benchmark datasets show the superiority of W-Softmax in image classification tasks. (C) 2020 Elsevier Ltd. All rights reserved.
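The abstract does not spell out the exact W-Softmax formulation, but the stated idea, biasing classifier weights toward the negative classes so that decision margins grow, with the margin size controlled by a hyperparameter alpha, can be sketched in code. The PyTorch snippet below is a minimal illustrative sketch under that assumption; the class name WeightsBiasedSoftmaxLoss, the weight normalization, and the specific mixing rule (pulling each negative-class weight toward the ground-truth class weight by a factor alpha) are hypothetical choices for illustration, not the paper's published method.

  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class WeightsBiasedSoftmaxLoss(nn.Module):
      # Illustrative margin-enlarging cross-entropy over a linear classifier.
      # NOTE: the mixing rule below is an assumption made for illustration,
      # not the paper's exact W-Softmax formulation.
      def __init__(self, feat_dim, num_classes, alpha=0.5):
          super().__init__()
          self.weight = nn.Parameter(0.01 * torch.randn(num_classes, feat_dim))
          self.alpha = alpha

      def forward(self, features, labels):
          w = F.normalize(self.weight, dim=1)   # (C, D) unit-norm class weights
          w_pos = w[labels]                     # (N, D) ground-truth class weights
          # Bias every class weight toward the ground-truth weight, then renormalize.
          # For the ground-truth class itself this is a no-op after normalization,
          # so only the negative-class logits are inflated.
          w_biased = F.normalize(w.unsqueeze(0) + self.alpha * w_pos.unsqueeze(1), dim=2)  # (N, C, D)
          logits = torch.einsum("nd,ncd->nc", features, w_biased)
          return F.cross_entropy(logits, labels)

  # Usage sketch: 512-d features from a CNN backbone, 10 classes.
  criterion = WeightsBiasedSoftmaxLoss(feat_dim=512, num_classes=10, alpha=0.5)
  feats = torch.randn(8, 512)
  labels = torch.randint(0, 10, (8,))
  loss = criterion(feats, labels)

In this sketch, a larger alpha pushes the biased negative-class weights closer to the ground-truth direction, which raises the negative logits and forces the learned feature to clear a wider margin before the sample is classified correctly; alpha = 0 recovers an ordinary softmax cross-entropy over the normalized classifier weights.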
