Article

A convergent incremental gradient method with a constant step size

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 18, Issue 1, Pages 29-51

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/040615961

Keywords

incremental gradient method; convergence analysis; sensor networks; neural networks; logistic regression; boosting

Abstract

An incremental aggregated gradient method for minimizing a sum of continuously differentiable functions is presented. The method computes a single component gradient per iteration and uses a constant step size. When the gradient is bounded and Lipschitz continuous, the method is shown to visit regions in which the gradient is small infinitely often. Under certain unimodality assumptions, global convergence is established. In the quadratic case, a global linear rate of convergence is shown. The method is applied to distributed optimization problems arising in wireless sensor networks, and numerical experiments compare the new method with other incremental gradient methods.
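To make the iteration concrete, the following is a minimal Python/NumPy sketch of an incremental aggregated gradient update of the kind the abstract describes: a table of the most recent gradient of each component is kept, one component gradient is refreshed per iteration, and the step uses the running sum of the table with a constant step size. The function name iag, the cyclic component order, and the full initialization pass are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def iag(grads, x0, step, n_iters):
        """Sketch of an incremental aggregated gradient iteration.

        grads   : list of per-component gradient functions grad_i(x)
        x0      : starting point
        step    : constant step size (must be small relative to the
                  Lipschitz constants for convergence to hold)
        n_iters : number of single-gradient iterations
        """
        m = len(grads)
        x = np.asarray(x0, dtype=float).copy()
        # One full pass to initialize the stored gradients (an assumed
        # convention) and their running sum.
        stored = [g(x) for g in grads]
        agg = np.sum(stored, axis=0)
        for k in range(n_iters):
            i = k % m                      # cyclic component selection
            new_g = grads[i](x)            # single gradient evaluation per iteration
            agg = agg + new_g - stored[i]  # refresh the aggregate in O(dim)
            stored[i] = new_g
            x = x - step * agg             # constant-step update with the aggregate
        return x

As a hypothetical usage, for least-squares components f_i(x) = 0.5 * (a_i @ x - b_i)**2 the gradient functions can be built as

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 5))
    b = rng.standard_normal(50)
    grads = [lambda x, a=a_i, bi=bi: a * (a @ x - bi) for a_i, bi in zip(A, b)]
    x_hat = iag(grads, np.zeros(5), step=1e-3, n_iters=20000)

where the default-argument bindings in the lambda pin each row to its own gradient function.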

Authors

Doron Blatt; Alfred O. Hero; Hillel Gauchman
