4.7 Review

A review of Nystrom methods for large-scale machine learning

Journal

INFORMATION FUSION
Volume 26, Pages 36-48

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2015.03.001

Keywords

Low-rank approximation; Nystrom method; Sampling method; Machine learning

Funding

  1. National Natural Science Foundation of China [61370175]
  2. Shanghai Knowledge Service Platform Project [ZF1213]


Generating a low-rank matrix approximation is very important in large-scale machine learning applications. The standard Nystrom method is one of the state-of-the-art techniques for generating such an approximation, and it has developed rapidly since first being applied to Gaussian process regression. Several enhanced Nystrom methods, such as ensemble Nystrom, modified Nystrom and SS-Nystrom, have been proposed, and many sampling methods have been developed. In this paper, we review Nystrom methods for large-scale machine learning. First, we introduce the various Nystrom methods. Second, we review the different sampling methods for the Nystrom methods and summarize them from the perspectives of both theoretical analysis and practical performance. Then, we list several typical machine learning applications that utilize Nystrom methods. Finally, we draw our conclusions after discussing some open machine learning problems related to Nystrom methods. (C) 2015 Elsevier B.V. All rights reserved.
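
To illustrate the kind of low-rank approximation the paper surveys, below is a minimal sketch of the standard Nystrom method with uniform column sampling: sample m landmark columns of a symmetric PSD matrix K, form the column block C and intersection block W, and approximate K by C W^+ C^T. The function name, the uniform-sampling choice, and the RBF-kernel demo are illustrative assumptions, not code from the paper; the enhanced variants it reviews (ensemble, modified, SS-Nystrom) and the non-uniform sampling schemes differ mainly in how this pseudo-inverse step and the column selection are performed.

```python
import numpy as np

def nystrom_approximation(K, num_landmarks, rng=None):
    """Standard Nystrom low-rank approximation of a symmetric PSD matrix K.

    Uniformly samples `num_landmarks` columns, forms C (sampled columns)
    and W (the landmark-by-landmark intersection block), and returns the
    factors of the approximation K_hat = C @ pinv(W) @ C.T.
    """
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    idx = rng.choice(n, size=num_landmarks, replace=False)  # uniform sampling without replacement
    C = K[:, idx]                # n x m block of sampled columns
    W = K[np.ix_(idx, idx)]      # m x m intersection block
    W_pinv = np.linalg.pinv(W)   # Moore-Penrose pseudo-inverse of W
    return C, W_pinv             # K_hat = C @ W_pinv @ C.T

if __name__ == "__main__":
    # Hypothetical demo: approximate an RBF kernel matrix on random data.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * sq_dists)                 # 500 x 500 RBF kernel matrix
    C, W_pinv = nystrom_approximation(K, num_landmarks=50, rng=0)
    K_hat = C @ W_pinv @ C.T
    err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
    print(f"relative Frobenius error: {err:.3f}")
```

The payoff is computational: only the m sampled columns of K are needed, so the factors cost O(nm) storage and O(nm^2 + m^3) time instead of the O(n^2) / O(n^3) required to form and decompose the full matrix.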
