Journal
SIAM JOURNAL ON OPTIMIZATION
Volume 18, Issue 1, Pages 29-51
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/040615961
Keywords
incremental gradient method; convergence analysis; sensor networks; neural networks; logistic regression; boosting
An incremental aggregated gradient method for minimizing a sum of continuously differentiable functions is presented. The method requires a single gradient evaluation per iteration and uses a constant step size. For the case that the gradient is bounded and Lipschitz continuous, we show that the method infinitely often visits regions in which the gradient is small. Under certain unimodality assumptions, global convergence is established. In the quadratic case, a global linear rate of convergence is shown. The method is applied to distributed optimization problems arising in wireless sensor networks, and numerical experiments compare the new method with other incremental gradient methods.
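The core mechanism described in the abstract — one component gradient evaluated per iteration, with the step taken along the sum of the most recently stored gradients of all components — can be sketched as follows. This is a minimal illustration of the incremental aggregated gradient idea on a toy sum-of-quadratics problem, not the authors' implementation; the function names, the cyclic component order, and the step size are choices made here for the example.

```python
import numpy as np

def iag(grads, x0, step, n_iters):
    """Sketch of an incremental aggregated gradient iteration.

    grads   : list of per-component gradient functions grad_i(x)
    x0      : starting point
    step    : constant step size (assumed small enough for stability)
    n_iters : number of iterations; each evaluates ONE component gradient

    A table holds the most recent gradient of every component; each
    iteration refreshes one entry (cyclically) and steps along the
    aggregated (summed) stored gradients.
    """
    m = len(grads)
    x = np.asarray(x0, dtype=float)
    table = [g(x) for g in grads]      # one-time initialization of the table
    agg = np.sum(table, axis=0)        # running sum of the stored gradients
    for k in range(n_iters):
        i = k % m                      # cyclic choice of the component
        new_g = grads[i](x)            # the single gradient evaluation
        agg += new_g - table[i]        # update the aggregate in O(dim)
        table[i] = new_g
        x = x - step * agg             # constant-step update
    return x

# Toy problem: f(x) = sum_i 0.5 * (a_i @ x - b_i)^2, a strongly convex quadratic
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 2))
b = rng.normal(size=5)
grads = [lambda x, a=A[i], bi=b[i]: (a @ x - bi) * a for i in range(5)]

x_hat = iag(grads, np.zeros(2), step=0.02, n_iters=5000)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # least-squares minimizer
```

In the quadratic case above the iterate should approach the least-squares solution at a linear rate, consistent with the rate result stated in the abstract; a step size that is too large relative to the Lipschitz constants of the component gradients will break convergence, which is why a small constant step is used here.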