Article

Convex and Nonconvex Formulations for Mixed Regression With Two Components: Minimax Optimal Rates

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 64, Issue 3, Pages 1738-1766

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2017.2773474

Keywords

Mixture models; statistical learning; minimax techniques; information theory

Funding

  1. NSF [CCF-1704828, EECS-1056028, CNS-1302435, CCF-1116955]
  2. CRII [1657420]
  3. School of Operations Research and Information Engineering, Cornell University
  4. USDOT UTC-D-STOP Center, UT-Austin
  5. Directorate For Engineering
  6. Div Of Electrical, Commun & Cyber Sys [1056028] Funding Source: National Science Foundation

Abstract

We consider the mixed regression problem with two components, under adversarial and stochastic noise. We give a convex optimization formulation that provably recovers the true solution, as well as a nonconvex formulation that works under more general settings and remains tractable. Upper bounds are provided on the recovery errors for both arbitrary noise and stochastic noise models. We also give matching minimax lower bounds (up to log factors), showing that our algorithm is information-theoretically optimal in a precise sense. Our results provide the first tractable algorithm guaranteeing successful recovery with tight bounds on recovery errors and sample complexity. Moreover, we pinpoint the statistical cost of mixtures: our minimax-optimal results indicate that the mixture poses a fundamentally more difficult problem in the low-SNR regime, where the learning rate changes.
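To make the problem setup concrete, the sketch below generates data from a two-component mixed linear regression (each response comes from one of two hidden regression vectors) and recovers the components with a simple alternating-minimization heuristic. This is only an illustrative baseline under assumed parameters (dimensions, noise level, symmetric components), not the convex or nonconvex formulation analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 400, 5

# Two hidden regression vectors; each sample's label z_i picks one of them.
beta1 = rng.normal(size=d)
beta2 = -beta1  # symmetric, well-separated components (an assumption here)

X = rng.normal(size=(n, d))
z = rng.integers(0, 2, size=n)                     # hidden component labels
y = np.where(z == 0, X @ beta1, X @ beta2) + 0.01 * rng.normal(size=n)

def objective(b1, b2):
    # Sum over samples of the smaller of the two squared residuals.
    return np.sum(np.minimum((y - X @ b1) ** 2, (y - X @ b2) ** 2))

# Alternating minimization: assign each sample to the better-fitting
# component, then refit each component by least squares. Each step cannot
# increase the objective, though convergence to the truth is not guaranteed.
b1, b2 = rng.normal(size=d), rng.normal(size=d)
obj_init = objective(b1, b2)
for _ in range(50):
    mask = (y - X @ b1) ** 2 <= (y - X @ b2) ** 2
    if mask.all() or not mask.any():               # degenerate assignment
        break
    b1 = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    b2 = np.linalg.lstsq(X[~mask], y[~mask], rcond=None)[0]

obj_final = objective(b1, b2)
# Recovery error up to the label-swap ambiguity inherent to mixtures.
err = min(np.linalg.norm(b1 - beta1), np.linalg.norm(b1 - beta2))
```

Note the permutation ambiguity: the two components can only ever be identified up to swapping labels, which is why the error is measured against both true vectors.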
