Article

The Information Geometry of Mirror Descent

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 61, Issue 3, Pages 1451-1457

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TIT.2015.2388583

Keywords

Online learning; information geometry; differential geometry; natural gradient; mirror descent

Funding

  1. NSF through the Statistical and Applied Mathematical Sciences Institute [DMS-1127914]
  2. NSF [CCF-1049290]
  3. NIH through the Systems Biology Project [5P50-GM081883]
  4. AFOSR [FA9550-10-1-0436]
  5. NSF Directorate for Mathematical & Physical Sciences
  6. NSF Division of Mathematical Sciences [1209155]

Abstract

We prove the equivalence of two online learning algorithms: 1) mirror descent and 2) natural gradient descent. Both mirror descent and natural gradient descent are generalizations of online gradient descent for the case where the parameter of interest lies on a non-Euclidean manifold. Natural gradient descent selects the steepest descent direction along a Riemannian manifold by premultiplying the standard gradient by the inverse of the metric tensor. Mirror descent induces non-Euclidean structure by solving iterative optimization problems using different proximity functions. In this paper, we prove that mirror descent induced by Bregman divergence proximity functions is equivalent to the natural gradient descent algorithm on the dual Riemannian manifold. We use techniques from convex analysis and connections between Riemannian manifolds, Bregman divergences, and convexity to prove this result. This equivalence between natural gradient descent and mirror descent implies that: 1) mirror descent is the steepest descent direction along the Riemannian manifold corresponding to the chosen Bregman divergence and 2) mirror descent with log-likelihood loss, applied to parameter estimation in exponential families, asymptotically achieves the classical Cramér-Rao lower bound.
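
To make the stated equivalence concrete, below is a minimal numerical sketch (not taken from the paper) for one standard choice of potential: the negative entropy G(θ) = Σ_i (θ_i log θ_i − θ_i) on the positive orthant, whose convex conjugate is G*(μ) = Σ_i exp(μ_i). The mirror descent update ∇G(θ_{t+1}) = ∇G(θ_t) − α ∇f(θ_t) and natural gradient descent on the dual coordinates μ = ∇G(θ) = log θ, with metric tensor ∇²G*(μ) = diag(exp(μ)), produce identical iterates. The quadratic loss and all variable names here are illustrative choices, not from the paper.

```python
# Illustrative sketch of the mirror descent / natural gradient equivalence
# for the negative-entropy potential; loss and step size are arbitrary choices.
import numpy as np

theta_star = np.array([0.2, 1.5, 0.7])       # minimizer of the loss (assumed example)
grad_f = lambda t: t - theta_star            # gradient of f(theta) = 0.5 * ||theta - theta_star||^2
alpha = 0.1                                  # fixed step size

theta_md = np.ones(3)                        # mirror descent iterate (primal coordinates)
mu_ng = np.zeros(3)                          # natural gradient iterate (dual coordinates, mu = log theta)

for _ in range(50):
    # Mirror descent: grad G(theta_{t+1}) = grad G(theta_t) - alpha * grad f(theta_t),
    # with grad G(theta) = log(theta) and inverse map grad G*(mu) = exp(mu).
    theta_md = np.exp(np.log(theta_md) - alpha * grad_f(theta_md))

    # Natural gradient descent on the dual manifold with metric grad^2 G*(mu) = diag(exp(mu)).
    theta = np.exp(mu_ng)                    # pull back to primal coordinates
    metric = np.diag(theta)                  # metric tensor at mu
    grad_mu = metric @ grad_f(theta)         # chain rule: grad_mu f(exp(mu)) = diag(exp(mu)) grad_theta f
    mu_ng = mu_ng - alpha * np.linalg.solve(metric, grad_mu)

print(np.allclose(theta_md, np.exp(mu_ng)))  # True: the two parameter sequences coincide
```

The printout is True: starting from the same point, the two sequences of iterates agree step for step, which is the content of the equivalence theorem specialized to this particular Bregman geometry.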

