Journal
APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS
Volume 21, Issue 1, Pages 128-134
Publisher
ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.acha.2006.03.004
Keywords
-
Abstract
The convergence of the discrete graph Laplacian to the continuous manifold Laplacian in the limit of sample size N --> infinity, while the kernel bandwidth epsilon --> 0, justifies the success of Laplacian-based algorithms in machine learning, such as dimensionality reduction, semi-supervised learning, and spectral clustering. In this paper we improve the convergence rate of the variance term recently obtained by Hein et al. [From graphs to manifolds-Weak and strong pointwise consistency of graph Laplacians, in: P. Auer, R. Meir (Eds.), Proc. 18th Conf. Learning Theory (COLT), Lecture Notes Comput. Sci., vol. 3559, Springer-Verlag, Berlin, 2005, pp. 470-485], improve the bias term error, and find an optimal criterion for determining the parameter epsilon given N. (C) 2006 Elsevier Inc. All rights reserved.
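The convergence stated in the abstract can be checked numerically. The sketch below is not from the paper: the Gaussian kernel width convention, the row normalization, the resulting factor 2/eps, and the unit-circle test function are all my own illustrative choices. For f(theta) = cos(theta) on the unit circle, the Laplace-Beltrami operator gives Delta f = -cos(theta), so the discrete estimate should approach that as N grows and eps shrinks.

```python
import numpy as np

# Assumed setup (illustrative, not the paper's): N equispaced samples on
# the unit circle embedded in R^2, Gaussian kernel exp(-|x-y|^2 / (2*eps)).
N, eps = 2000, 0.01
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
X = np.column_stack([np.cos(theta), np.sin(theta)])
f = np.cos(theta)  # test function; Delta f = -cos(theta) on the circle

# Pairwise squared Euclidean (chordal) distances and kernel weights.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-D2 / (2.0 * eps))

# Row-normalized averaging operator A, then the standard estimator
# (2/eps) * (A f - f), which tends to Delta f as N -> infinity, eps -> 0
# (the factor 2 matches this kernel normalization).
Af = (W @ f) / W.sum(axis=1)
L_eps_f = 2.0 / eps * (Af - f)

err = np.max(np.abs(L_eps_f - (-np.cos(theta))))
print(f"max pointwise error: {err:.4f}")  # O(eps) bias; small for these N, eps
```

With a deterministic equispaced grid there is no variance term, so the residual error here is essentially the O(eps) bias that the paper's analysis quantifies; the bandwidth-selection question the abstract raises is how to shrink eps with N when the samples are random.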