Article

Divergence-based classification in learning vector quantization

Journal

NEUROCOMPUTING
Volume 74, Issue 9, Pages 1429-1435

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2010.10.016

Keywords

Classification; Learning vector quantization; Prototype-based classifiers; Similarity measures; Distance measures

Funding

  1. Federal Ministry of Education and Research, Germany [FZ: 0313833]
  2. German Research Foundation (DFG) [HA2719/4-1]
  3. Netherlands Organization for International Cooperation in Higher Education (NUFFIC) [NPT-UGA-238]

Abstract

We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consists of non-negative, potentially normalized features, as is the case, for instance, with spectral data or histograms. In particular, we introduce and study divergence-based learning vector quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family of gamma-divergences, which includes the well-known Kullback-Leibler divergence and the so-called Cauchy-Schwarz divergence as special cases. The corresponding training schemes are applied to two different real-world data sets. The first, the Wisconsin Breast Cancer benchmark data set, is available in the public domain. In the second problem, color histograms of leaf images are used to detect the presence of cassava mosaic disease in cassava plants. We compare the use of standard Euclidean distances with DLVQ for different parameter settings and show that DLVQ can yield superior classification accuracies and Receiver Operating Characteristics. (C) 2011 Elsevier B.V. All rights reserved.
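
For reference, a brief sketch of the gamma-divergence family mentioned in the abstract, in its commonly used form for non-negative vectors p and q; this is not taken from the paper itself, and the authors' exact parametrization may differ:

    D_\gamma(p \,\|\, q) \;=\; \frac{1}{\gamma(\gamma+1)}\,
    \log \frac{\bigl(\sum_i p_i^{\,1+\gamma}\bigr)\,\bigl(\sum_i q_i^{\,1+\gamma}\bigr)^{\gamma}}
              {\bigl(\sum_i p_i\, q_i^{\,\gamma}\bigr)^{1+\gamma}},
    \qquad \gamma > 0.

For gamma = 1 this reduces to the Cauchy-Schwarz divergence,
D_1(p \,\|\, q) = -\log \frac{\sum_i p_i q_i}{\sqrt{\bigl(\sum_i p_i^2\bigr)\bigl(\sum_i q_i^2\bigr)}},
and in the limit gamma -> 0 (for normalized p and q) it recovers the Kullback-Leibler divergence, \mathrm{KL}(p \,\|\, q) = \sum_i p_i \log(p_i / q_i).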
