Article

Generative Local Metric Learning for Nearest Neighbor Classification

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2017.2666151

Keywords

Metric learning; nearest neighbor classification; f-divergence; generative-discriminative hybridization

Funding

  1. U.S. Air Force Office of Scientific Research
  2. National Security Research Institute in Korea
  3. SNU-MAE BK21+program
  4. Korea government [IITP-R0126-16-1072, KEIT-10044009, KEIT-10060086]
  5. U.S. National Science Foundation
  6. U.S. Department of Transportation
  7. U.S. Army Research Laboratory
  8. U.S. Office of Naval Research

Abstract

We consider the problem of learning a local metric in order to enhance the performance of nearest neighbor classification. Conventional metric learning methods attempt to separate data distributions in a purely discriminative manner; here we show how to take advantage of information from parametric generative models. We focus on the bias in the information-theoretic error arising from finite sampling effects, and find an appropriate local metric that maximally reduces the bias based upon knowledge from generative models. As a byproduct, the asymptotic theoretical analysis in this work relates metric learning to dimensionality reduction from a novel perspective, which was not understood from previous discriminative approaches. Empirical experiments show that this learned local metric enhances the discriminative nearest neighbor performance on various datasets using simple class conditional generative models such as a Gaussian.
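The abstract contrasts purely discriminative metric learning with a hybrid that draws the metric from class-conditional generative models. As a minimal illustrative sketch only (not the paper's bias-minimizing local metric), the example below uses class-conditional Gaussian statistics, via the pooled within-class covariance, to build a global Mahalanobis metric and then runs 1-NN under it; all function names here are hypothetical:

```python
import numpy as np

def fit_mahalanobis_metric(X, y):
    """Build a metric M from simple Gaussian class-conditional models:
    the inverse of the pooled within-class covariance (a stand-in for
    the paper's generative, bias-reducing local metric)."""
    d = X.shape[1]
    S = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c] - X[y == c].mean(axis=0)
        S += Xc.T @ Xc           # accumulate within-class scatter
    S /= len(X)
    # small ridge term keeps the inverse well-conditioned
    return np.linalg.inv(S + 1e-6 * np.eye(d))

def nn_predict(X_train, y_train, X_test, M):
    """1-NN under the squared Mahalanobis distance (x-z)^T M (x-z)."""
    preds = []
    for x in X_test:
        diff = X_train - x
        dists = np.einsum('ij,jk,ik->i', diff, M, diff)
        preds.append(y_train[np.argmin(dists)])
    return np.array(preds)
```

With `M = np.eye(d)` this reduces to Euclidean 1-NN, so the same routine lets one compare the plain and generative-model-informed metrics on a dataset.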
