Journal
NUMERICAL ALGORITHMS
Volume 42, Issue 1, Pages 63-73
Publisher
SPRINGER
DOI: 10.1007/s11075-006-9023-9
Keywords
acceleration methods; backtracking; gradient descent methods
In this paper we introduce an acceleration of the gradient descent algorithm with backtracking. The idea is to modify the steplength t_k by means of a positive parameter theta_k, in a multiplicative manner, so as to improve the behaviour of the classical gradient algorithm. It is shown that the resulting algorithm remains linearly convergent, but the reduction in function value is significantly improved.
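The abstract describes rescaling the backtracking steplength t_k by a multiplicative factor theta_k. The paper's exact rule for theta_k is not given here, so the following is only a minimal sketch: it combines standard Armijo backtracking with an assumed choice of theta_k obtained by secant (quadratic) interpolation of the derivative along the search direction; the function names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def accelerated_gd(f, grad, x0, t0=1.0, rho=0.5, c=1e-4,
                   tol=1e-8, max_iter=1000):
    """Gradient descent with Armijo backtracking, where the accepted
    steplength t_k is rescaled by a multiplicative factor theta_k.
    The theta_k rule below is an illustrative assumption, not the
    paper's exact formula."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Backtracking: shrink t until the Armijo sufficient-decrease
        # condition f(x - t g) <= f(x) - c * t * ||g||^2 holds.
        t, fx, gg = t0, f(x), g @ g
        while f(x - t * g) > fx - c * t * gg:
            t *= rho
        # Assumed acceleration: estimate the best step along d = -g by a
        # secant step on phi'(t), where phi(t) = f(x + t d); then
        # theta_k = t*/t_k rescales the backtracking steplength.
        dphi0 = -gg                      # phi'(0)
        dphit = -(grad(x - t * g) @ g)   # phi'(t_k)
        denom = dphi0 - dphit
        theta = dphi0 / denom if denom < 0 else 1.0  # >0 when curvature >0
        x = x - theta * t * g
    return x

# Usage: minimize a simple ill-conditioned quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = accelerated_gd(f, grad, np.array([1.0, 1.0]))
```

On quadratics the secant estimate of theta_k is exact along the line, so the rescaled step lands much closer to the one-dimensional minimizer than plain backtracking; the guard `denom < 0` falls back to theta_k = 1 (the classical step) when the curvature estimate is not positive.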