Journal
IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 64, Issue 3, Pages 1738-1766
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2017.2773474
Keywords
Mixture models; statistical learning; minimax techniques; information theory
Funding
- NSF [CCF-1704828, EECS-1056028, CNS-1302435, CCF-1116955]
- CRII [1657420]
- School of Operations Research and Information Engineering, Cornell University
- USDOT UTC-D-STOP Center, UT-Austin
- Directorate For Engineering
- Div Of Electrical, Commun & Cyber Sys [1056028] Funding Source: National Science Foundation
Abstract
We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, as well as a nonconvex formulation that works in more general settings and remains tractable. We provide upper bounds on the recovery error for both the arbitrary-noise and stochastic-noise models, together with matching minimax lower bounds (up to log factors), showing that our algorithm is information-theoretically optimal in a precise sense. Our results give the first tractable algorithm guaranteeing successful recovery, with tight bounds on recovery error and sample complexity. Moreover, we pinpoint the statistical cost of mixtures: our minimax-optimal results indicate that the mixture poses a fundamentally harder problem in the low-SNR regime, where the learning rate changes.
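To make the problem setting concrete, the following is a minimal sketch of two-component mixed linear regression: each response is generated by one of two unknown regressors, and the component labels are latent. The solver below uses plain alternating minimization with random restarts, which is an illustrative baseline only, not the convex or nonconvex formulation analyzed in the paper; all names and parameter choices are hypothetical.

```python
import numpy as np


def fit_two_component(X, y, iters=50, restarts=5, seed=1):
    """Alternating minimization for two-component mixed linear regression.

    Repeatedly assigns each sample to the component with the smaller
    squared residual, then refits both components by least squares.
    Random restarts guard against poor local minima. Illustrative only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    best = None
    for _ in range(restarts):
        b1 = rng.normal(size=d)
        b2 = rng.normal(size=d)
        for _ in range(iters):
            mask = (y - X @ b1) ** 2 <= (y - X @ b2) ** 2
            if mask.sum() >= d:
                b1, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
            if (~mask).sum() >= d:
                b2, *_ = np.linalg.lstsq(X[~mask], y[~mask], rcond=None)
        loss = np.minimum((y - X @ b1) ** 2, (y - X @ b2) ** 2).sum()
        if best is None or loss < best[0]:
            best = (loss, b1, b2)
    return best[1], best[2]


# Synthetic data from a symmetric two-component mixture (a common
# special case): y_i = <x_i, beta1> or <x_i, -beta1>, plus small noise.
rng = np.random.default_rng(0)
n, d = 400, 5
beta1 = rng.normal(size=d)
beta2 = -beta1
X = rng.normal(size=(n, d))
z = rng.integers(0, 2, size=n)            # latent component labels
y = np.where(z == 0, X @ beta1, X @ beta2) + 0.01 * rng.normal(size=n)

b1, b2 = fit_two_component(X, y)

# Recovery error up to the inherent label swap of the two components.
err = min(np.linalg.norm(b1 - beta1) + np.linalg.norm(b2 - beta2),
          np.linalg.norm(b1 - beta2) + np.linalg.norm(b2 - beta1))
```

With low noise and sufficiently many samples the recovered pair matches the ground truth up to a label swap; the abstract's point is that in the low-SNR regime this recovery becomes fundamentally harder, with a different minimax learning rate.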