Article

A nearest-neighbor search model for distance metric learning

Journal

INFORMATION SCIENCES
Volume 552, Pages 261-277

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.11.054

Keywords

Distance metric learning; Nearest-neighbor selection; Search model; k-nearest-neighbor

Funding

  1. National Natural Science Foundation of China [61876044, 62076074]
  2. Guangdong Natural Science Foundation [2020A1515010670, 2020A1515011501]
  3. Science and Technology Planning Project of Guangzhou [202002030141]


The paper introduces a nearest-neighbor search model for distance metric learning (NNS-DML), which constructs metric optimization constraints by searching for a different optimal nearest-neighbor number for each training instance. The model weights these constraints to reduce the influence of each instance's irrelevant features on its similar and dissimilar instance pairs, and the authors further develop a k-free nearest-neighbor model for classification. Extensive experiments show that NNS-DML outperforms state-of-the-art distance metric learning methods.
Distance metric learning aims to capture the structure of the data distribution by learning a suitable distance metric from the training instances. The optimization constraints for metric learning can be constructed from similar and dissimilar instance pairs, which are generated by selecting the nearest neighbors of each training instance. However, most methods select the same fixed nearest-neighbor number for all training instances, which may limit the performance of the learned metric. In this paper, we propose a nearest-neighbor search model for distance metric learning (NNS-DML), which constructs the metric optimization constraints by searching for a different optimal nearest-neighbor number for each training instance. Specifically, we formulate a nearest-neighbor search matrix that encodes the nearest-neighbor correlations of all training instances. Using this search matrix, we construct and weight the metric optimization constraints of each training instance so that the influence of its irrelevant features on its similar and dissimilar instance pairs is reduced. Moreover, we develop a k-free nearest-neighbor model for classification problems via an SVM solver, which removes the need to set k. Extensive experiments show that the proposed NNS-DML method outperforms state-of-the-art distance metric learning methods. (C) 2020 Elsevier Inc. All rights reserved.
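The pipeline the abstract describes — choosing a per-instance neighbor count, forming similar/dissimilar pairs from those neighbors, and learning a metric that down-weights irrelevant features — might be sketched roughly as follows. This is an illustrative simplification, not the authors' formulation: the per-instance search uses a simple label-purity heuristic in place of the paper's search matrix, the metric is restricted to a diagonal Mahalanobis form learned by plain gradient descent, and all function names are hypothetical.

```python
import numpy as np

def per_instance_k(X, y, k_candidates=(1, 3, 5)):
    """For each training instance, pick the neighbor count whose k nearest
    Euclidean neighbors are most label-pure (a stand-in for the paper's
    nearest-neighbor search matrix)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)          # exclude each instance from its own neighbors
    ks = np.empty(len(X), dtype=int)
    for i in range(len(X)):
        order = np.argsort(D[i])
        best_k, best_purity = k_candidates[0], -1.0
        for k in k_candidates:
            purity = np.mean(y[order[:k]] == y[i])
            if purity > best_purity:
                best_k, best_purity = k, purity
        ks[i] = best_k
    return ks

def build_pairs(X, y, ks):
    """Similar pairs: same-label neighbors among each instance's chosen k
    nearest neighbors; dissimilar pairs: different-label neighbors."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)
    sim, dis = [], []
    for i, k in enumerate(ks):
        for j in np.argsort(D[i])[:k]:
            (sim if y[j] == y[i] else dis).append((i, int(j)))
    return sim, dis

def learn_diag_metric(X, sim, dis, lr=0.01, epochs=100):
    """Gradient descent on a diagonal Mahalanobis metric: features with
    large variation inside similar pairs (noise) are shrunk, features that
    separate dissimilar pairs are grown."""
    w = np.ones(X.shape[1])
    for _ in range(epochs):
        grad = np.zeros_like(w)
        for i, j in sim:
            grad += (X[i] - X[j]) ** 2   # pull similar pairs closer
        for i, j in dis:
            grad -= (X[i] - X[j]) ** 2   # push dissimilar pairs apart
        w = np.clip(w - lr * grad / (len(sim) + len(dis)), 1e-6, None)
    return w
```

On a toy set where one feature carries the class labels and another is noise, the learned diagonal weights end up larger on the informative feature, illustrating (in miniature) how weighted pairwise constraints can suppress irrelevant features.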

