Article

Metric learning via perturbing hard-to-classify instances

Journal

PATTERN RECOGNITION
Volume 132

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2022.108928

Keywords

Metric learning; Hard-to-classify instances; Instance perturbation; Alternating minimization

Funding

  1. National Key Research and Development Program of China [2019YFE0118200]
  2. National Natural Science Foundation of China [62006147, 61976184]

Abstract

Constraint selection is an effective means to alleviate the problem of the massive number of constraints in metric learning. However, it is difficult to find and handle all of the association constraints that involve the same hard-to-classify instance (i.e., an instance surrounded by dissimilar instances), which negatively affects metric learning algorithms. To address this problem, we propose a new metric learning algorithm from the perspective of selecting instances, Metric Learning via Perturbing Hard-to-classify Instances (ML-PHI), which directly perturbs the hard-to-classify instances to reduce over-fitting to them. ML-PHI perturbs hard-to-classify instances to be closer to similar instances while keeping the positions of the remaining instances as constant as possible. As a result, the negative impact of hard-to-classify instances is effectively reduced. We have conducted extensive experiments on real data sets, and the results show that ML-PHI is effective and outperforms state-of-the-art methods.

(c) 2022 Elsevier Ltd. All rights reserved.
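As a rough illustration of the idea sketched in the abstract, and not the paper's actual formulation, the Python sketch below alternates between a generic pairwise Mahalanobis metric update and a perturbation step that moves hard-to-classify instances (points whose nearest neighbours are mostly of a different class) towards the mean of their same-class points, leaving all other instances untouched. The pairwise loss, the neighbourhood size k, the dissimilarity threshold, and the step sizes are all illustrative assumptions.

# Minimal, hypothetical sketch of an alternating scheme in the spirit of
# ML-PHI: (1) update a Mahalanobis metric M on the (possibly perturbed)
# data; (2) nudge hard-to-classify instances towards their own class.
import numpy as np

def mahalanobis_sq(X, M):
    """Pairwise squared Mahalanobis distances under metric matrix M."""
    diff = X[:, None, :] - X[None, :, :]           # shape (n, n, d)
    return np.einsum('ijk,kl,ijl->ij', diff, M, diff)

def hard_instances(X, y, M, k=5, thresh=0.5):
    """Indices of points whose k nearest neighbours are mostly dissimilar."""
    D = mahalanobis_sq(X, M)
    np.fill_diagonal(D, np.inf)
    nn = np.argsort(D, axis=1)[:, :k]
    frac_dissimilar = (y[nn] != y[:, None]).mean(axis=1)
    return np.where(frac_dissimilar > thresh)[0]

def ml_phi_sketch(X, y, n_iters=10, k=5, step=0.3, lr=1e-3):
    n, d = X.shape
    M = np.eye(d)                                   # start from the Euclidean metric
    Xp = X.copy()                                   # perturbed copy of the data
    for _ in range(n_iters):
        # Step 1: gradient step on M that pulls similar pairs together and
        # pushes dissimilar pairs apart (a generic pairwise surrogate loss).
        grad = np.zeros((d, d))
        for i in range(n):
            for j in range(i + 1, n):
                o = np.outer(Xp[i] - Xp[j], Xp[i] - Xp[j])
                grad += o if y[i] == y[j] else -o
        M -= lr * grad
        # Project back onto the PSD cone so M remains a valid metric.
        w, V = np.linalg.eigh(M)
        M = (V * np.clip(w, 0, None)) @ V.T
        # Step 2: move each hard-to-classify instance part of the way towards
        # the mean of its same-class points; all other instances stay fixed.
        for i in hard_instances(Xp, y, M, k=k):
            same = Xp[y == y[i]]
            Xp[i] = (1 - step) * Xp[i] + step * same.mean(axis=0)
    return M, Xp

In this reading, "keeping the positions of the remaining instances as constant as possible" corresponds to moving only the detected hard-to-classify instances while all other points are left in place; how the paper balances the two steps is not specified here.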

