Article

Statistical Optimization in High Dimensions

Journal

OPERATIONS RESEARCH
Volume 64, Issue 4, Pages 958-979

Publisher

INFORMS
DOI: 10.1287/opre.2016.1504

Keywords

programming; stochastic; statistics; nonparametric

Funding

  1. Ministry of Education of Singapore through AcRF Tier 2 [R265000443112]
  2. Agency for Science, Technology, and Research (A*STAR) of Singapore through SERC PSF [R266000101305]
  3. National Science Foundation [1056028, 1302435, 1116955]
  4. U.S. DoT through the Data-Supported Transportation Operations and Planning (D-STOP) Tier 1 University Transportation Center
  5. Israel Science Foundation [920/12]
  6. National Science Foundation, Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1116955, 1302435]
  7. National Science Foundation, Directorate for Engineering, Division of Electrical, Communications & Cyber Systems [1056028]


We consider optimization problems whose parameters are known only approximately, based on noisy samples. In large-scale applications, the number of samples one can collect is typically of the same order as (or even smaller than) the dimensionality of the problem. This so-called high-dimensional statistical regime has been the object of intense recent research in machine learning and statistics, primarily due to phenomena inherent to this regime, such as the fact that the noise one sees here often dwarfs the magnitude of the signal itself. While relevant in numerous important operations research and engineering optimization applications, this setup falls far outside the traditional scope of robust and stochastic optimization. We propose three algorithms to address this setting, combining ideas from statistics, machine learning, and robust optimization. Our algorithms are motivated by three natural optimization objectives: minimizing the number of grossly violated constraints; maximizing the number of exactly satisfied constraints; and, finally, developing algorithms whose running time scales with the intrinsic dimension of a problem, as opposed to its observed dimension, a mismatch that, as we discuss in detail, can be dire in settings where constraints are meant to describe preferences of behaviors. The key ingredients of our algorithms are dimensionality reduction techniques from machine learning, robust optimization, and concentration of measure tools from statistics.
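
The abstract points to two quantitative phenomena: when the number of noisy samples is of the same order as the dimension, the estimation error in a problem parameter is itself of the order of the signal, and random-projection-style dimensionality reduction can shrink the working dimension while approximately preserving geometry. The following is a minimal numerical sketch of both, for illustration only; the Gaussian sample model, the projection size k, and all variable names are assumptions of this sketch, not the paper's algorithms.

# Illustrative sketch (not the paper's algorithms): (i) estimating a constraint
# vector from n ~ d noisy samples leaves an error of the same order as the
# signal itself; (ii) a generic Johnson-Lindenstrauss random projection, one
# kind of dimensionality-reduction tool the abstract alludes to.
import numpy as np

rng = np.random.default_rng(0)

d = 2000      # observed (ambient) dimension
n = 2000      # number of noisy samples, same order as d
sigma = 1.0   # per-coordinate noise level

# True constraint vector ("signal"), normalized to unit length.
a_true = rng.normal(size=d)
a_true /= np.linalg.norm(a_true)

# n noisy observations of a_true; the plug-in estimate is their sample mean.
samples = a_true + sigma * rng.normal(size=(n, d))
a_hat = samples.mean(axis=0)

err = np.linalg.norm(a_hat - a_true)
print(f"signal norm = 1.00, estimation-error norm ~ {err:.2f}")
# With n of the same order as d, the error norm is roughly sigma * sqrt(d/n),
# i.e., comparable to the signal it is supposed to estimate.

# Generic Johnson-Lindenstrauss random projection to k << d dimensions:
# Euclidean norms (and hence distances) are preserved up to a small
# relative error with high probability.
k = 500
R = rng.normal(size=(k, d)) / np.sqrt(k)
x = rng.normal(size=d)
print(f"||x|| = {np.linalg.norm(x):.1f},  ||R x|| = {np.linalg.norm(R @ x):.1f}")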
