Article

Nonparametric Sparsity and Regularization

Journal

JOURNAL OF MACHINE LEARNING RESEARCH
Volume 14, Issue -, Pages 1665-1714

Publisher

MICROTOME PUBL

Keywords

sparsity; nonparametric; variable selection; regularization; proximal methods; RKHS

Funding

  1. Integrated Project Health-e-Child [IST-2004-027749]
  2. DARPA
  3. National Science Foundation [NSF-0640097, NSF-0827427]
  4. Compagnia di San Paolo, Torino
  5. Adobe
  6. Honda Research Institute USA
  7. King Abdullah University of Science and Technology
  8. Eugene McDermott Foundation

Abstract

In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the model by making use of partial derivatives. Based on this intuition we propose a new notion of nonparametric sparsity and a corresponding least squares regularization scheme. Using concepts and results from the theory of reproducing kernel Hilbert spaces and proximal methods, we show that the proposed learning algorithm corresponds to a minimization problem which can be provably solved by an iterative procedure. The consistency properties of the obtained estimator are studied both in terms of prediction and selection performance. An extensive empirical analysis shows that the proposed method performs favorably with respect to the state-of-the-art methods.
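The key idea described above, scoring each input variable by the magnitude of the model's partial derivative with respect to it, can be illustrated with a minimal sketch. This is not the paper's algorithm: it fits a plain kernel ridge regression (rather than the proposed derivative-based regularization scheme, which would require the proximal iterative procedure) and then evaluates the analytic partial derivatives of the fitted function. The names `gaussian_kernel` and `derivative_importance` are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def derivative_importance(X, y, sigma=1.0, lam=0.05):
    """Score each variable by the RMS of the fitted estimator's
    partial derivative with respect to it (illustrative helper)."""
    n, d = X.shape
    K = gaussian_kernel(X, X, sigma)
    # Plain kernel ridge regression: f(x) = sum_j alpha_j * k(x, x_j).
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    # Analytic gradient of f at each training point:
    # df/dx^a(x_i) = sum_j alpha_j * K[i, j] * (x_j^a - x_i^a) / sigma^2.
    diff = X[:, None, :] - X[None, :, :]                      # (n, n, d)
    grads = -(alpha[None, :, None] * K[:, :, None] * diff).sum(axis=1) / sigma ** 2
    # RMS of each partial derivative over the sample: one score per variable.
    return np.sqrt((grads ** 2).mean(axis=0))
```

Variables on which the target does not depend receive derivative scores near zero, which is the notion of nonparametric sparsity the abstract refers to; the paper's scheme enforces this directly through regularization rather than measuring it post hoc.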
