Article

Generalized relevance learning vector quantization

Journal

NEURAL NETWORKS
Volume 15, Issue 8-9, Pages 1059-1068

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/S0893-6080(02)00079-5

Keywords

clustering; learning vector quantization; adaptive metric; relevance determination

We propose a new scheme for extending generalized learning vector quantization (GLVQ) with weighting factors for the input dimensions. The factors allow an appropriate scaling of the input dimensions according to their relevance. They are adapted automatically during training according to the specific classification task, whereby training can be interpreted as stochastic gradient descent on an appropriate error function. This method leads to a more powerful classifier and to an adaptive metric with little extra cost compared to standard GLVQ. Moreover, the size of the weighting factors indicates the relevance of the input dimensions, which yields a scheme for automatically pruning irrelevant input dimensions. The algorithm is verified on artificial data sets and the Iris data from the UCI repository. Afterwards, the method is compared to several well-known algorithms which determine the intrinsic data dimension on real-world satellite image data. (C) 2002 Elsevier Science Ltd. All rights reserved.
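The mechanism sketched in the abstract can be illustrated in code: each input dimension carries a relevance factor that scales the squared distance, and both the prototypes and the relevance factors are updated by stochastic gradient descent on the GLVQ-style cost (d+ - d-)/(d+ + d-). The sketch below is a minimal illustration of this idea, not the authors' reference implementation; function names, hyperparameters, and the toy data are assumptions.

```python
import numpy as np

def grlvq_train(X, y, prototypes, proto_labels,
                lr_w=0.05, lr_l=0.01, epochs=30, seed=0):
    """Minimal GRLVQ sketch (hyperparameters are illustrative)."""
    rng = np.random.default_rng(seed)
    W = prototypes.astype(float).copy()
    lam = np.full(X.shape[1], 1.0 / X.shape[1])  # relevance factors, sum to 1

    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x, label = X[i], y[i]
            # relevance-weighted squared distances to all prototypes
            d = ((x - W) ** 2 * lam).sum(axis=1)
            same = proto_labels == label
            jp = np.where(same)[0][np.argmin(d[same])]    # closest correct prototype
            jm = np.where(~same)[0][np.argmin(d[~same])]  # closest wrong prototype
            dp, dm = d[jp], d[jm]
            denom = (dp + dm) ** 2 + 1e-12
            gp, gm = 2 * dm / denom, 2 * dp / denom  # gradients of (dp-dm)/(dp+dm)
            # prototype updates: attract the correct winner, repel the wrong one
            W[jp] += lr_w * gp * 2 * lam * (x - W[jp])
            W[jm] -= lr_w * gm * 2 * lam * (x - W[jm])
            # relevance update, then project back to the simplex
            lam -= lr_l * (gp * (x - W[jp]) ** 2 - gm * (x - W[jm]) ** 2)
            lam = np.clip(lam, 0.0, None)
            lam /= lam.sum()
    return W, lam

# Toy data (assumed for illustration): dimension 0 separates the classes,
# dimension 1 is pure noise, so its relevance factor should shrink.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([-1.0, 0.0], 0.3, (50, 2)),
               rng.normal([+1.0, 0.0], 0.3, (50, 2))])
y = np.repeat([0, 1], 50)
W, lam = grlvq_train(X, y,
                     prototypes=np.array([[-0.5, 0.0], [0.5, 0.0]]),
                     proto_labels=np.array([0, 1]))
```

After training, inspecting `lam` shows the informative dimension dominating the noise dimension, which is the relevance-based pruning signal the abstract describes.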
