Article

Variable three-term conjugate gradient method for training artificial neural networks

Journal

NEURAL NETWORKS
Volume 159, Pages 125-136

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2022.12.001

Keywords

Three-term conjugate gradient; Variable step size; Artificial neural networks; Image classification and generation; Intelligent robotic grasping

Abstract

Artificial neural networks (ANNs) have been widely adopted as general computational tools in computer science and in many other engineering fields. Stochastic gradient descent (SGD) and adaptive methods such as Adam are popular, robust optimization algorithms used to train ANNs. However, the effectiveness of these algorithms is limited because they calculate the search direction using only first-order gradient information. Although higher-order methods such as Newton's method have been proposed, they require the Hessian matrix to be positive semi-definite, and its inversion incurs a high computational cost. Therefore, in this paper, we propose a variable three-term conjugate gradient (VTTCG) method that approximates the Hessian matrix to enhance the search direction and uses a variable step size to achieve improved convergence stability. To evaluate the performance of the VTTCG method, we train different ANNs on benchmark image classification and generation datasets. We also conduct a similar experiment in which a grasp generation and selection convolutional neural network (GGS-CNN) is trained to perform intelligent robotic grasping. In addition to a simulated environment, we test the GGS-CNN with a physical grasping robot. The experimental results show that the performance of the VTTCG method is superior to that of four conventional methods: SGD, Adam, AMSGrad, and AdaBelief.

© 2022 Elsevier Ltd. All rights reserved.
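
For context: the abstract names the ingredients of VTTCG (a three-term conjugate gradient direction, an approximated Hessian, and a variable step size) but does not reproduce the update formulas. The following is a minimal NumPy sketch of a generic three-term conjugate gradient iteration in the well-known Hestenes-Stiefel style, with an exact line search on a toy quadratic standing in for the paper's variable step size. The coefficient choices (beta, theta) and the function names are illustrative assumptions, not the authors' VTTCG rules.

import numpy as np

def three_term_direction(g, g_prev, d_prev, eps=1e-10):
    """Generic three-term CG direction (Hestenes-Stiefel style; NOT the
    paper's VTTCG rule):

        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
        y_{k-1} = g_k - g_{k-1}.

    With these coefficients the direction satisfies
    d_k^T g_k = -||g_k||^2, i.e. it is always a descent direction.
    """
    y = g - g_prev
    denom = d_prev @ y + eps        # curvature term d_{k-1}^T y_{k-1}
    beta = (g @ y) / denom          # conjugacy coefficient
    theta = (g @ d_prev) / denom    # third-term coefficient
    return -g + beta * d_prev - theta * y

# Toy objective: f(w) = 0.5 w^T A w - b^T w, so grad f(w) = A w - b.
rng = np.random.default_rng(0)
A = np.diag(np.linspace(1.0, 10.0, 20))
b = rng.standard_normal(20)
grad = lambda w: A @ w - b

w = np.zeros(20)
g = grad(w)
d = -g                              # first iteration: steepest descent
for _ in range(50):
    # Exact line search on the quadratic; a stand-in for the paper's
    # variable step size rule, which is not reproduced here.
    alpha = -(g @ d) / (d @ A @ d)
    w = w + alpha * d
    g_prev, g = g, grad(w)
    d = three_term_direction(g, g_prev, d)

print("final gradient norm:", float(np.linalg.norm(g)))

The third term is what distinguishes this family from classical two-term conjugate gradient methods: it restores a descent guarantee without imposing line-search conditions, which is one reason three-term variants are attractive for the noisy gradients encountered in ANN training.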
