Journal
NEUROCOMPUTING
Volume 83, Issue -, Pages 31-37
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2011.10.021
Keywords
Metric learning; Nearest neighbor; Dimensionality reduction; Kernel method
Funding
- National Natural Science Foundation of China [60872099, 60902099]
The distance metric is of considerable importance in a variety of machine learning and pattern recognition applications. Neighborhood component analysis (NCA), one of the most successful metric learning algorithms, suffers from high computational cost, which makes it suitable only for small-scale classification tasks. To overcome this disadvantage, we propose a fast neighborhood component analysis (FNCA) method. For a given sample, FNCA adopts a local probability distribution model constructed from its K nearest neighbors in the same class and in different classes. We further extend FNCA to nonlinear metric learning scenarios using the kernel trick. Experimental results show that, compared with NCA, FNCA not only significantly increases training speed but also achieves higher classification accuracy. Furthermore, comparative studies with state-of-the-art approaches on various real-world datasets also verify the effectiveness of the proposed linear and nonlinear FNCA methods. (C) 2011 Elsevier B.V. All rights reserved.
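The abstract describes the core idea: instead of NCA's softmax over all training points, FNCA restricts each sample's probability model to its K nearest same-class and K nearest different-class neighbors. The following is a minimal sketch of such a local NCA-style objective under a linear map L; the function name, the use of squared Euclidean distances in the projected space, and the exact neighbor selection are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fnca_local_objective(X, y, L, K=3):
    """Illustrative FNCA-style objective (assumed form, not the paper's exact one):
    for each sample, a softmax over distances to its K nearest same-class and
    K nearest different-class neighbors in the projected space Lx, summing the
    probability mass placed on the same-class neighbors."""
    Z = X @ L.T                      # project samples into the learned metric space
    n = len(X)
    total = 0.0
    for i in range(n):
        d2 = np.sum((Z - Z[i]) ** 2, axis=1)          # squared distances to all points
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        diff = np.where(y != y[i])[0]
        same_k = same[np.argsort(d2[same])[:K]]       # K nearest same-class neighbors
        diff_k = diff[np.argsort(d2[diff])[:K]]       # K nearest different-class neighbors
        nbrs = np.concatenate([same_k, diff_k])
        w = np.exp(-d2[nbrs])                         # softmax weights over the local set
        p = w / w.sum()
        total += p[: len(same_k)].sum()               # mass on same-class neighbors
    return total / n
```

Because each sample's softmax runs over only 2K neighbors rather than all n-1 points, evaluating this objective (and its gradient) scales far better than full NCA, which is the speedup the abstract claims.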