Article

Fast neighborhood component analysis

Journal

NEUROCOMPUTING
Volume 83, Pages 31-37

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2011.10.021

Keywords

Metric learning; Nearest neighbor; Dimensionality reduction; Kernel method

Funding

  1. National Natural Science Foundation of China [60872099, 60902099]

Distance metrics are of considerable importance in a variety of machine learning and pattern recognition applications. Neighborhood component analysis (NCA), one of the most successful metric learning algorithms, suffers from high computational cost, which makes it suitable only for small-scale classification tasks. To overcome this disadvantage, we propose a fast neighborhood component analysis (FNCA) method. For a given sample, FNCA adopts a local probability distribution model constructed from its K nearest neighbors in the same class and in different classes. We further extend FNCA to nonlinear metric learning scenarios using the kernel trick. Experimental results show that, compared with NCA, FNCA not only significantly increases the training speed but also achieves higher classification accuracy. Furthermore, comparative studies with state-of-the-art approaches on various real-world datasets also verify the effectiveness of the proposed linear and nonlinear FNCA methods. (C) 2011 Elsevier B.V. All rights reserved.
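The abstract's key idea — replacing NCA's softmax over all other training points with a local distribution over each sample's K nearest neighbors — can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the choice to find neighbors in the projected space, and the use of the identity metric as a starting point are all assumptions for illustration.

```python
import numpy as np

def fnca_objective(A, X, y, K=5):
    """Hypothetical FNCA-style objective, sketched from the abstract:
    for each sample, a softmax over squared distances to its K nearest
    neighbors approximates NCA's leave-one-out probability, instead of
    summing over all n-1 other points."""
    Z = X @ A.T                       # apply the linear transform A
    n = len(X)
    total = 0.0
    for i in range(n):
        d2 = np.sum((Z - Z[i]) ** 2, axis=1)
        d2[i] = np.inf                # exclude the point itself
        nbrs = np.argsort(d2)[:K]     # K nearest neighbors (assumed: in projected space)
        w = np.exp(-d2[nbrs])
        p = w / w.sum()               # local softmax distribution over neighbors
        total += p[y[nbrs] == y[i]].sum()  # probability mass on same-class neighbors
    return total / n                  # average leave-one-out accuracy proxy

# Toy usage on two well-separated Gaussian classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
A = np.eye(4)                         # identity metric as a starting point
print(round(fnca_objective(A, X, y), 3))
```

In a full method this objective would be maximized with respect to A by gradient ascent; restricting the sum to K neighbors is what reduces the per-sample cost from O(n) to O(K), matching the speedup the abstract claims.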
