4.8 Article

Diffusion maps and coarse-graining: A unified framework for dimensionality reduction, graph partitioning, and data set parameterization

Journal

IEEE Transactions on Pattern Analysis and Machine Intelligence

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2006.184

Keywords

machine learning; text analysis; knowledge retrieval; quantization; graph-theoretic methods; compression (coding); clustering; clustering similarity measures; information visualization; Markov processes; graph algorithms

We provide evidence that nonlinear dimensionality reduction, clustering, and data set parameterization can be solved within one and the same framework. The main idea is to define a system of coordinates with an explicit metric that reflects the connectivity of a given data set and that is robust to noise. Our construction, which is based on a Markov random walk on the data, offers a general scheme of simultaneously reorganizing and subsampling graphs and arbitrarily shaped data sets in high dimensions using intrinsic geometry. We show that clustering in embedding spaces is equivalent to compressing operators. The objective of data partitioning and clustering is to coarse-grain the random walk on the data while at the same time preserving a diffusion operator for the intrinsic geometry or connectivity of the data set up to some accuracy. We show that the quantization distortion in diffusion space bounds the error of compression of the operator, thus giving a rigorous justification for k-means clustering in diffusion space and a precise measure of the performance of general clustering algorithms.
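To make the construction concrete, below is a minimal sketch (not taken from the paper) of the pipeline the abstract describes: build a Markov random walk on the data from a Gaussian kernel, embed the points in diffusion coordinates, and run k-means in that space. The function name diffusion_map, the kernel bandwidth epsilon, and the two-moons toy data are illustrative assumptions; the sketch uses NumPy, SciPy, and scikit-learn rather than any code released with the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans


def diffusion_map(X, epsilon=1.0, n_components=2, t=1):
    """Embed the rows of X with diffusion coordinates.

    Builds a Gaussian affinity matrix, normalizes it into a Markov
    transition matrix, and maps each point to the top non-trivial
    eigenvectors scaled by the t-th power of their eigenvalues.
    """
    # Gaussian kernel with bandwidth epsilon (an assumed, illustrative choice).
    K = np.exp(-cdist(X, X, "sqeuclidean") / epsilon)
    d = K.sum(axis=1)                      # node degrees
    # Symmetric conjugate of the Markov matrix P = D^-1 K; it shares
    # P's eigenvalues and is numerically better behaved.
    S = K / np.sqrt(np.outer(d, d))
    eigvals, eigvecs = np.linalg.eigh(S)
    order = np.argsort(eigvals)[::-1]      # sort eigenvalues descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Right eigenvectors of P; the first (constant) one is dropped.
    psi = eigvecs / np.sqrt(d)[:, None]
    lam = eigvals[1:n_components + 1] ** t
    return psi[:, 1:n_components + 1] * lam


# Toy example: two noisy half-moons, clustered by k-means in diffusion space.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, np.pi, 200)
moon1 = np.c_[np.cos(theta), np.sin(theta)]
moon2 = np.c_[1.0 - np.cos(theta), 0.5 - np.sin(theta)]
X = np.vstack([moon1, moon2]) + 0.05 * rng.standard_normal((400, 2))

coords = diffusion_map(X, epsilon=0.1, n_components=2, t=1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
print("cluster sizes:", np.bincount(labels))
```

In this embedding, Euclidean distances approximate diffusion distances on the data, so the k-means objective is precisely the quantization distortion that, per the abstract, bounds the error of compressing the diffusion operator.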

Authors

Stéphane Lafon; Ann B. Lee
