Article

Study on fast speed fractional order gradient descent method and its application in neural networks

Journal

NEUROCOMPUTING
Volume 489, Pages 366-376

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.02.034

Keywords

Gradient descent method; Optimization; Fractional order calculus; Particle swarm optimization; Neural networks

Funding

  1. National Natural Science Foundation of China [61973291]
  2. fund of China Scholarship Council [201806345002]


Abstract

This article introduces a novel fractional order gradient descent method for the quadratic loss function. Based on the Riemann-Liouville definition, a more practical fractional order gradient descent method with a variable initial value is proposed to ensure convergence to the actual extremum. On this basis, the random weight particle swarm optimization algorithm is introduced to select an appropriate initial value, which not only accelerates the convergence speed but also enhances the global convergence ability of the algorithm. To avoid the complications of the chain rule in fractional calculus, the parameters of the output layers are trained by the newly designed method, while the parameters of the hidden layers still use the conventional method. With properly selected hyper-parameters, the proposed method converges faster than comparable methods. Finally, numerical examples are given to verify that the proposed algorithm has fast convergence speed and high accuracy over an adequately large number of independent runs. (C) 2022 Published by Elsevier B.V.
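The variable-initial-value idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the update rule below uses a commonly cited Riemann-Liouville-style fractional gradient step for a scalar quadratic loss, with the lower terminal reset to the previous iterate at each step; the function names, step size, and order value are illustrative assumptions.

```python
import math

def frac_gd_quadratic(grad, x0, alpha=0.9, mu=0.5, steps=100):
    """Sketch of fractional-order gradient descent for a quadratic loss.

    Uses a Riemann-Liouville-style update with a *variable* lower
    terminal c_k (here, the previous iterate), a known device for
    making the iteration converge to the true extremum:

        x_{k+1} = x_k - mu * grad(x_k) * |x_k - c_k|^(1-alpha) / Gamma(2-alpha)

    All constants here are illustrative, not taken from the paper.
    """
    g2 = math.gamma(2.0 - alpha)
    # Small offset so the |x - c| factor is nonzero on the first step.
    x_prev, x = x0, x0 + 1e-3
    for _ in range(steps):
        step = mu * grad(x) * abs(x - x_prev) ** (1.0 - alpha) / g2
        x_prev, x = x, x - step
    return x

# Quadratic loss f(x) = (x - 3)^2 / 2, whose gradient is f'(x) = x - 3;
# the iteration should approach the minimizer x* = 3.
x_star = frac_gd_quadratic(lambda x: x - 3.0, x0=0.0, alpha=0.9, mu=0.5)
```

Note that with a *fixed* lower terminal the fractional update generally converges to a point offset from the true extremum, which is why the abstract emphasizes the variable initial value.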

