Article

Mirror descent and nonlinear projected subgradient methods for convex optimization

Journal

OPERATIONS RESEARCH LETTERS
Volume 31, Issue 3, Pages 167-175

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/S0167-6377(02)00231-6

Keywords

nonsmooth convex minimization; projected subgradient methods; nonlinear projections; mirror descent algorithms; relative entropy; complexity analysis; global rate of convergence

Abstract

The mirror descent algorithm (MDA) was introduced by Nemirovsky and Yudin for solving convex optimization problems. This method exhibits an efficiency estimate that is mildly dependent on the dimension of the decision variables, and is thus suitable for solving very large scale optimization problems. We present a new derivation and analysis of this algorithm. We show that the MDA can be viewed as a nonlinear projected subgradient-type method, derived by using a general distance-like function in place of the usual squared Euclidean distance. Within this interpretation, we derive convergence and efficiency estimates in a simple way. We then propose an entropic mirror descent algorithm for convex minimization over the unit simplex, with a global efficiency estimate proven to be mildly dependent on the dimension of the problem. (C) 2003 Elsevier Science B.V. All rights reserved.
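As a concrete illustration of the entropic variant described in the abstract, the sketch below implements mirror descent over the unit simplex with the relative-entropy (Kullback-Leibler) distance, which yields the well-known multiplicative, exponentiated-gradient update. The test problem (a diagonal quadratic) and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def entropic_mirror_descent(grad, x0, step, n_iter):
    """Mirror descent on the unit simplex using relative entropy
    as the distance-like function. The resulting update is
        x_{k+1} proportional to x_k * exp(-t_k * grad f(x_k)),
    followed by renormalization back onto the simplex."""
    x = np.asarray(x0, dtype=float)
    for k in range(n_iter):
        g = grad(x)
        # multiplicative (exponentiated-gradient) step
        w = x * np.exp(-step(k) * g)
        x = w / w.sum()  # renormalize: the entropic "projection" is just scaling
    return x

# Illustrative example (not from the paper): minimize f(x) = x^T Q x over the
# simplex; the minimizer puts the most weight on the smallest diagonal entry.
Q = np.diag([3.0, 1.0, 2.0])
grad_f = lambda x: 2.0 * Q @ x
x0 = np.ones(3) / 3.0                      # start at the simplex barycenter
x = entropic_mirror_descent(grad_f, x0,
                            step=lambda k: 1.0 / np.sqrt(k + 1),  # decaying steps
                            n_iter=500)
```

Because the entropic projection reduces to a renormalization, each iteration costs only O(n) operations, which is the source of the mild dimension dependence emphasized in the abstract.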
