Article

Study on fast speed fractional order gradient descent method and its application in neural networks

Journal

NEUROCOMPUTING
Volume 489, Issue -, Pages 366-376

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.02.034

Keywords

Gradient descent method; Optimization; Fractional order calculus; Particle swarm optimization; Neural networks

Funding

  1. National Natural Science Foundation of China [61973291]
  2. China Scholarship Council [201806345002]

Abstract

This article introduces a novel fractional order gradient descent method for the quadratic loss function. Based on the Riemann-Liouville definition, a more practical fractional order gradient descent method with a variable initial value is proposed to ensure convergence to the actual extremum. On this basis, the random weight particle swarm optimization algorithm is introduced to select an appropriate initial value, which not only accelerates convergence but also enhances the global convergence ability of the algorithm. To avoid the complications of the chain rule in fractional calculus, the parameters of the output layer are trained by the newly designed method, while the parameters of the hidden layers are still trained by the conventional method. With properly selected hyper-parameters, the proposed method converges faster than existing methods. Finally, numerical examples are given to verify that the proposed algorithm achieves fast convergence and high accuracy over an adequately large number of independent runs. © 2022 Published by Elsevier B.V.
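For intuition about how such an update differs from ordinary gradient descent, the sketch below implements one common realization of the idea: the Riemann-Liouville fractional gradient truncated to its first term, with the lower terminal (the "initial value" in the abstract) moved to the previous iterate at each step. This is a minimal illustration under those assumptions; the function name, step size, and order alpha are hypothetical, and the paper's exact update rule and its PSO-based initial-value selection may differ.

import math

def fractional_gd(grad, x0, alpha=0.9, lr=0.1, iters=200):
    # Truncated Riemann-Liouville fractional gradient descent with a
    # variable lower terminal: moving c to the previous iterate at each
    # step is what allows convergence to the actual extremum rather
    # than to a point biased by a fixed terminal. (Illustrative sketch,
    # not necessarily the authors' exact scheme.)
    c = x0           # lower terminal of the fractional derivative
    x = x0 + 1e-2    # start slightly away from the terminal
    for _ in range(iters):
        # First-term truncation: f'(x) * |x - c|^(1 - alpha) / Gamma(2 - alpha)
        fg = grad(x) * abs(x - c) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
        x, c = x - lr * fg, x
    return x

# Quadratic loss f(x) = (x - 3)^2 with gradient 2(x - 3); the returned
# value should be close to the true minimizer 3.
print(fractional_gd(lambda x: 2.0 * (x - 3.0), x0=0.0))

Because 0 < alpha < 1, the factor |x - c|^(1 - alpha) shrinks as successive iterates approach each other, so the iteration can settle at a true stationary point of the loss, which a fixed-terminal fractional gradient generally cannot.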
