Article

Efficiently updating and tracking the dominant kernel principal components

Journal

NEURAL NETWORKS
Volume 20, Issue 2, Pages 220-229

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2006.09.012

Keywords

dominant eigenspace; eigenvalues; updating; tracking; kernel Gram matrix; principal components; large-scale data

Abstract

The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods (e.g., kernel principal component analysis, feature approximation, denoising, compression, and prediction) in machine learning. In the case of dynamic and/or large-scale data, however, the batch nature and computational demands of the eigenvector decomposition limit these methods in numerous applications. In this paper we present an efficient incremental approach for fast calculation of the dominant kernel eigenbasis, which allows us to track the kernel eigenspace dynamically. Experiments show that, compared with the batch algorithm, our updating scheme delivers numerically stable and accurate approximations of the eigenvalues and eigenvectors at every iteration. (c) 2006 Elsevier Ltd. All rights reserved.
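The abstract describes the method only at a high level, and the paper's own updating scheme is not reproduced in this record. As a rough, generic illustration of what incremental dominant-eigenspace tracking looks like, the Python sketch below absorbs one new data point at a time into a rank-r eigenbasis of a growing kernel Gram matrix and re-truncates; the function names, the RBF kernel, and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    # Kernel choice is an assumption for this demo, not from the paper.
    d2 = (np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def update_eigenspace(U, s, k_new, k_self, r):
    """One generic incremental step (a sketch, not the paper's algorithm):
    absorb a new row/column of the kernel Gram matrix into the rank-r
    dominant eigenspace (U, s).

    U : (n, r) current dominant eigenvectors
    s : (r,)   current dominant eigenvalues
    k_new  : (n,) kernel values between the new point and the old points
    k_self : scalar, kernel value of the new point with itself
    """
    n, r_cur = U.shape
    # Split the new column into its component inside span(U) and a residual.
    p = U.T @ k_new
    res = k_new - U @ p
    res_norm = np.linalg.norm(res)
    # If the new column already lies in span(U), the residual direction is
    # degenerate; a full implementation would drop it instead of zeroing.
    q = res / res_norm if res_norm > 1e-12 else np.zeros(n)

    # Small (r+2)x(r+2) core: the enlarged Gram matrix expressed in the
    # orthonormal basis [U, q, e_new], using K ~ U diag(s) U^T.
    core = np.zeros((r_cur + 2, r_cur + 2))
    core[:r_cur, :r_cur] = np.diag(s)
    core[:r_cur, -1] = p
    core[-1, :r_cur] = p
    core[r_cur, -1] = res_norm
    core[-1, r_cur] = res_norm
    core[-1, -1] = k_self

    # Eigendecompose the small core, then rotate and truncate to rank r.
    w, V = np.linalg.eigh(core)
    order = np.argsort(w)[::-1][:r]
    U_ext = np.zeros((n + 1, r_cur + 2))
    U_ext[:n, :r_cur] = U
    U_ext[:n, r_cur] = q
    U_ext[n, r_cur + 1] = 1.0
    return U_ext @ V[:, order], w[order]

# Demo: batch start on 20 points, then stream in 30 more one at a time.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
s0, U0 = np.linalg.eigh(rbf_kernel(X[:20], X[:20]))
r = 5
U, s = U0[:, ::-1][:, :r], s0[::-1][:r]     # dominant r-dim eigenspace

for i in range(20, 50):
    k_new = rbf_kernel(X[:i], X[i:i+1]).ravel()
    U, s = update_eigenspace(U, s, k_new, 1.0, r)  # RBF: k(x, x) = 1

# Compare tracked eigenvalues with a batch eigendecomposition.
print(np.round(s, 3))
print(np.round(np.linalg.eigvalsh(rbf_kernel(X, X))[::-1][:r], 3))
```

The point of such a scheme is that each step works on a small (r+2)-dimensional core matrix rather than on the full Gram matrix, so the per-point cost stays low; this is the general motivation behind the incremental updating the abstract describes, though the paper's actual algorithm and its stability guarantees may differ from this sketch.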
