Journal
PATTERN RECOGNITION
Volume 39, Issue 4, Pages 635-645
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2005.09.004
Keywords
classification; nearest neighbors; Cam distribution; distance measure
Nearest neighbor (NN) classification assumes locally constant class conditional probabilities, and suffers from bias in high dimensions with a small sample set. In this paper, we propose a novel cam weighted distance to ameliorate the curse of dimensionality. Different from the existing neighborhood-based methods, which only analyze a small space emanating from the query sample, the proposed nearest neighbor classification using the cam weighted distance (CamNN) optimizes the distance measure based on an analysis of the inter-prototype relationship. Our motivation comes from the observation that the prototypes are not isolated: prototypes with different surroundings should have different effects in the classification. The proposed cam weighted distance is orientation and scale adaptive, exploiting the relevant inter-prototype information so that a better classification performance can be achieved. Experiments show that CamNN significantly outperforms one nearest neighbor classification (1-NN) and k-nearest neighbor classification (k-NN) in most benchmarks, while its computational complexity is comparable with that of 1-NN classification. (c) 2005 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
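The abstract describes the idea only at a high level: each prototype carries its own orientation- and scale-adaptive distance, and the query is assigned the label of the prototype that is nearest under these per-prototype measures. A minimal sketch of that decision rule is given below. The parameter names `a`, `b`, and the per-prototype `direction` vector, and the simple cam factor `a + b*cos(theta)`, are illustrative assumptions here, not the paper's actual notation or estimation procedure.

```python
import numpy as np

def cam_weighted_distance(x, proto, a, b, direction):
    """Illustrative cam-weighted distance (assumed form, not the paper's):
    the Euclidean distance from query x to a prototype is divided by a
    directional factor a + b*cos(theta), where theta is the angle between
    (x - proto) and the prototype's characteristic direction.  With
    a > b >= 0 the factor stays positive; b = 0 recovers a purely
    scale-weighted Euclidean distance."""
    diff = x - proto
    r = np.linalg.norm(diff)
    if r == 0.0:
        return 0.0
    cos_theta = np.dot(diff, direction) / (r * np.linalg.norm(direction))
    return r / (a + b * cos_theta)

def camnn_classify(x, protos, labels, params):
    """1-NN decision under per-prototype cam-weighted distances.
    params[i] = (a_i, b_i, direction_i) for prototype i."""
    dists = [cam_weighted_distance(x, p, a, b, d)
             for p, (a, b, d) in zip(protos, params)]
    return labels[int(np.argmin(dists))]
```

Note that the per-query cost is one weighted-distance evaluation per prototype, which matches the abstract's claim that CamNN's complexity is comparable with 1-NN; the extra work of fitting each prototype's parameters happens offline.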