Article

Divergence-based classification in learning vector quantization

Journal

NEUROCOMPUTING
Volume 74, Issue 9, Pages 1429-1435

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2010.10.016

Keywords

Classification; Learning vector quantization; Prototype-based classifiers; Similarity measures; Distance measures

Funding

  1. Federal Ministry of Education and Research, Germany [FZ: 0313833]
  2. German Research Foundation (DFG) [HA2719/4-1]
  3. Netherlands Organization for International Cooperation in Higher Education (NUFFIC) [NPT-UGA-238]

Abstract

We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consists of non-negative, potentially normalized features. This is, for instance, the case for spectral data or histograms. In particular, we introduce and study divergence-based learning vector quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family of gamma-divergences, which includes the well-known Kullback-Leibler divergence and the so-called Cauchy-Schwarz divergence as special cases. The corresponding training schemes are applied to two different real-world data sets. The first one, a benchmark data set (Wisconsin Breast Cancer), is available in the public domain. In the second problem, color histograms of leaf images are used to detect the presence of cassava mosaic disease in cassava plants. We compare the use of standard Euclidean distances with DLVQ for different parameter settings. We show that DLVQ can yield superior classification accuracies and Receiver Operating Characteristics. (C) 2011 Elsevier B.V. All rights reserved.
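To illustrate the idea described in the abstract, the following is a minimal sketch, not the authors' implementation, of how a divergence can replace the Euclidean distance in a prototype-based classifier, assuming non-negative, normalized feature vectors such as color histograms. The Kullback-Leibler and Cauchy-Schwarz divergences shown are the standard definitions; the prototype values and function names are illustrative, and the cost-function-based DLVQ training via gamma-divergence gradients developed in the paper is not reproduced here.

```python
# Illustrative sketch: nearest-prototype classification with divergences
# instead of Euclidean distance. Assumes non-negative, normalized inputs
# (e.g. histograms). Not the paper's reference implementation.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D_KL(p || q) for discrete histograms."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return np.sum(p * np.log(p / q))

def cauchy_schwarz_divergence(p, q, eps=1e-12):
    """Cauchy-Schwarz divergence: -log( <p, q> / (||p|| * ||q||) )."""
    num = np.dot(p, q)
    den = np.linalg.norm(p) * np.linalg.norm(q)
    return -np.log(max(num / max(den, eps), eps))

def nearest_prototype_label(x, prototypes, labels, divergence=kl_divergence):
    """Assign x the label of the prototype with the smallest divergence."""
    dissimilarities = [divergence(x, w) for w in prototypes]
    return labels[int(np.argmin(dissimilarities))]

# Hypothetical usage with two normalized histogram prototypes
w_healthy = np.array([0.5, 0.3, 0.2])
w_diseased = np.array([0.1, 0.3, 0.6])
x = np.array([0.15, 0.35, 0.5])
print(nearest_prototype_label(x, [w_healthy, w_diseased],
                              ["healthy", "diseased"],
                              divergence=cauchy_schwarz_divergence))
```

In the paper's DLVQ schemes, such a divergence enters the LVQ cost function and the prototypes are adapted by gradient steps with respect to it; the snippet above only shows the classification step once prototypes are given.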
