Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 18, Issue 1, pp. 29-51
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/040615961
Keywords
incremental gradient method; convergence analysis; sensor networks; neural networks; logistic regression; boosting
Abstract
An incremental aggregated gradient method for minimizing a sum of continuously differentiable functions is presented. The method requires a single gradient evaluation per iteration and uses a constant step size. For the case that the gradient is bounded and Lipschitz continuous, we show that the method visits infinitely often regions in which the gradient is small. Under certain unimodality assumptions, global convergence is established. In the quadratic case, a global linear rate of convergence is shown. The method is applied to distributed optimization problems arising in wireless sensor networks, and numerical experiments compare the new method with other incremental gradient methods.
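The abstract describes an iteration that evaluates only one component gradient per step while taking a constant-step-size move along an aggregate of the most recently computed gradients of all components. The following is a minimal sketch of that idea, not the paper's exact algorithm: the cyclic component order, the function names, and the quadratic toy problem are illustrative assumptions.

```python
import numpy as np

def iag(grads, x0, step, n_iters):
    """Minimize sum_i f_i(x) using one component-gradient evaluation per iteration.

    grads   : list of callables, grads[i](x) = gradient of f_i at x
    x0      : starting point (numpy array)
    step    : constant step size (illustrative choice, not tuned per the paper)
    n_iters : number of single-gradient iterations
    """
    m = len(grads)
    x = x0.astype(float).copy()
    # Table holding the most recently computed gradient of each component.
    table = [grads[i](x) for i in range(m)]  # one initial sweep to fill the table
    agg = np.sum(table, axis=0)              # running aggregate of the table
    for k in range(n_iters):
        i = k % m                            # cyclic component selection (assumption)
        g_new = grads[i](x)                  # the single gradient evaluation
        agg += g_new - table[i]              # refresh aggregate in O(dim) work
        table[i] = g_new
        x -= (step / m) * agg                # constant-step-size update
    return x

# Toy quadratic problem: f_i(x) = 0.5 * ||x - c_i||^2, whose sum is
# minimized at the mean of the centers c_i.
rng = np.random.default_rng(0)
centers = rng.normal(size=(5, 3))
grads = [lambda x, c=c: x - c for c in centers]
x_star = iag(grads, np.zeros(3), step=0.5, n_iters=500)
```

On this strongly convex quadratic, the iterates approach the mean of the centers, consistent with the linear-rate behavior the abstract claims for the quadratic case.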