Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 57, Issue 7, Pages 4689-4708
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2011.2146690
Keywords
Estimation theory; feature selection; greedy algorithms; sparse recovery; statistical learning
Funding
- NSF [DMS-1007527, IIS-1016061]
- [AFOSR-10097389]
- [NSA-AMS 081024]
Abstract
Given a large set of basis functions, potentially more numerous than the samples, we consider the problem of learning a sparse target function that can be expressed as a linear combination of a small number of these basis functions. We are interested in two closely related themes: feature selection, i.e., identifying the basis functions with nonzero coefficients, and estimation accuracy, i.e., reconstructing the target function from noisy observations. Two heuristics widely used in practice are the forward and backward greedy algorithms. First, we show that neither heuristic alone is adequate. Second, we propose a novel combination that runs the forward greedy algorithm but takes backward steps adaptively whenever they are beneficial. For least squares regression, we develop strong theoretical results showing that the new procedure can effectively solve this problem under suitable assumptions. Experimental results support our theory.
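The adaptive forward-backward idea described in the abstract can be illustrated with a minimal sketch for least squares: a forward step adds the feature that most reduces the squared error, and after each addition a backward pass deletes any feature whose removal increases the error by less than a fraction `nu` of the most recent forward gain. This is only an illustrative sketch, not the paper's exact procedure; the parameter names (`epsilon`, `nu`) and the synthetic example below are assumptions chosen for demonstration.

```python
import numpy as np

def foba_least_squares(X, y, epsilon=1e-6, nu=0.5, max_features=None):
    """Sketch of forward-backward greedy feature selection for least squares.

    epsilon: stop when the forward step's loss reduction falls below this.
    nu:      a backward deletion is taken only if it raises the loss by
             less than nu times the last forward gain (assumed parameter).
    """
    n, d = X.shape
    if max_features is None:
        max_features = min(n, d)
    support = []                       # indices of selected features
    loss = float(np.sum(y ** 2))       # squared-error loss of the zero fit

    def fit_loss(idx):
        # least-squares loss restricted to the features in idx
        if not idx:
            return float(np.sum(y ** 2)), None
        coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
        resid = y - X[:, idx] @ coef
        return float(np.sum(resid ** 2)), coef

    while len(support) < max_features:
        # forward step: try every unused feature, keep the best
        best_j, best_loss = None, loss
        for j in range(d):
            if j in support:
                continue
            cand_loss, _ = fit_loss(support + [j])
            if cand_loss < best_loss:
                best_j, best_loss = j, cand_loss
        if best_j is None or loss - best_loss < epsilon:
            break                      # forward gain too small: stop
        gain = loss - best_loss
        support.append(best_j)
        loss = best_loss

        # adaptive backward steps: delete while it is cheap to do so
        while len(support) > 1:
            worst_k, worst_loss = None, None
            for k in support:
                cand_loss, _ = fit_loss([i for i in support if i != k])
                if worst_loss is None or cand_loss < worst_loss:
                    worst_k, worst_loss = k, cand_loss
            if worst_loss - loss < nu * gain:
                support.remove(worst_k)
                loss = worst_loss
            else:
                break

    support = sorted(support)
    _, coef = fit_loss(support)
    return support, coef

# Synthetic sparse recovery example (assumed data, for illustration only)
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta = np.zeros(20)
beta[[2, 5, 11]] = [3.0, -2.0, 1.5]
y = X @ beta + 0.01 * rng.standard_normal(100)
support, coef = foba_least_squares(X, y, epsilon=0.05)
```

Because each add-then-delete cycle decreases the loss by at least `(1 - nu)` times the forward gain, the procedure cannot oscillate indefinitely; this is the role of taking backward steps only "whenever beneficial."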