Article

A new information theoretic analysis of sum-of-squared-error kernel clustering

Journal

NEUROCOMPUTING
Volume 72, Issue 1-3, Pages 23-31

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2008.03.017

Keywords

Information theory; Renyi entropy; Sum-of-squared-error clustering; K-means; Mercer kernels; Parzen windowing


The contribution of this paper is to provide a new input space analysis of the properties of sum-of-squared-error K-means clustering performed in a Mercer kernel feature space. Such an analysis has been missing until now, even though kernel K-means has been popular in the clustering literature. Our derivation extends the theory of traditional K-means from properties of mean vectors to information theoretic properties of Parzen window estimated probability density functions (pdfs). In particular, Euclidean distance-based kernel K-means is shown to maximize an integrated squared error divergence measure between cluster pdfs and the overall pdf of the data, while a cosine similarity-based approach maximizes a Cauchy-Schwarz divergence measure. Furthermore, the iterative rules which assign data points to clusters in order to maximize these criteria are shown to depend on the cluster pdfs evaluated at the data points, in addition to the Renyi entropies of the clusters. The Bayes rule is shown to be a special case. (c) 2008 Elsevier B.V. All rights reserved.
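The Euclidean-distance variant discussed in the abstract can be sketched as follows. This is a minimal illustration of kernel K-means operating on a precomputed Gram matrix, not the authors' implementation; the function name, interface, and initialization scheme are assumptions for demonstration. The squared feature-space distance from a point to a cluster mean is computed entirely from kernel evaluations, and with a Gaussian kernel the per-cluster kernel sums are, up to normalization, exactly the Parzen window density estimates that the paper's analysis builds on.

```python
import numpy as np

def kernel_kmeans(K, n_clusters, init=None, n_iter=50, seed=0):
    """Lloyd-style kernel K-means on a precomputed Gram matrix K (n x n).

    Squared feature-space distance from point i to the mean of cluster c:
        K_ii - (2/m) * sum_{j in c} K_ij + (1/m^2) * sum_{j,l in c} K_jl
    where m = |c|. No explicit feature map is ever formed.
    """
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = (np.asarray(init) if init is not None
              else rng.integers(0, n_clusters, size=n))
    for _ in range(n_iter):
        dist = np.full((n, n_clusters), np.inf)
        for c in range(n_clusters):
            mask = labels == c
            m = mask.sum()
            if m == 0:
                continue  # empty cluster: leave distance at infinity
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / m
                          + K[np.ix_(mask, mask)].sum() / m**2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
    return labels
```

For example, with a Gaussian kernel `K[i, j] = exp(-(x_i - x_j)**2 / 2)` on two well-separated groups of points, the assignment rule recovers the groups even from an interleaved initialization, because each point's kernel sum against a cluster acts as an (unnormalized) Parzen estimate of that cluster's pdf at the point.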
