Article

An Improved Modification of Accelerated Double Direction and Double Step-Size Optimization Schemes

Journal

MATHEMATICS
Volume 10, Issue 2

Publisher

MDPI
DOI: 10.3390/math10020259

Keywords

gradient descent; line search; gradient descent methods; quasi-Newton method; convergence rate

Funding

  1. [IJ-0202]

Abstract

We propose an improved variant of the accelerated gradient optimization models for solving unconstrained minimization problems. By merging the positive features of both the double direction and the double step-size accelerated gradient models, we define an iterative method of a simpler form that is generally more effective. Convergence analysis shows that the defined iterative method is at least linearly convergent for uniformly convex and strictly convex functions. Numerical test results confirm the efficiency of the developed model in terms of CPU time, number of iterations, and number of function evaluations.
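The double direction and double step-size schemes mentioned above are accelerated gradient methods built around a line search. As a rough illustration of the baseline template such models extend — not the authors' method, whose iteration is defined in the paper itself — here is a minimal gradient descent with an Armijo backtracking line search applied to a strictly convex quadratic (the function names and the test problem are invented for this sketch):

```python
import numpy as np

def backtracking_gd(f, grad, x0, beta=0.5, sigma=1e-4, tol=1e-8, max_iter=10_000):
    """Plain gradient descent with an Armijo backtracking line search.

    A generic baseline only: accelerated double direction / double
    step-size schemes extend this template with an acceleration
    parameter and additional step sizes/directions per iteration.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        # Armijo condition: require sufficient decrease along -g.
        while f(x - t * g) > f(x) - sigma * t * (g @ g):
            t *= beta
        x = x - t * g
    return x, k

# Hypothetical strictly convex quadratic test problem.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star, iters = backtracking_gd(f, grad, np.zeros(2))
print(x_star, iters)
```

Linear convergence on such strictly convex problems is exactly the regime the paper's convergence analysis addresses; the improved scheme aims to reach the same tolerance in fewer iterations and function evaluations.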

