Article

Convergence of Batch Gradient Method for Training of Pi-Sigma Neural Network with Regularizer and Adaptive Momentum Term

Journal

Neural Processing Letters
Volume 55, Issue 4, Pages 4871-4888

Publisher

Springer
DOI: 10.1007/s11063-022-11069-0

Keywords

Batch gradient method; Pi-sigma neural network; Regularizer; Momentum term; Convergence

Abstract

This paper introduces the characteristics and challenges of the Pi-sigma neural network (PSNN) and proposes an improved sparse-response feed-forward algorithm. The algorithm combines an adaptive momentum term with a group lasso regularizer, which enables fast convergence and yields sparse, efficient networks.
The Pi-sigma neural network (PSNN) is a class of high-order feed-forward neural networks with product units in the output layer, which gives it fast convergence and a high degree of nonlinear mapping capability. Inspired by the sparse response of the human neural system, this paper investigates a sparse-response feed-forward algorithm, i.e., a gradient-descent-based high-order algorithm with a self-adaptive momentum term and a group lasso regularizer, for training PSNN inference models. The paper focuses on two challenging tasks. First, the standard group lasso regularizer is not differentiable at the origin, which causes oscillation of the error function and of the weight-norm gradient during training. A key contribution is to modify the usual group lasso regularization term by smoothing it at the origin; this yields sparse, efficient networks while also making a theoretical analysis of the algorithm possible. Second, an adaptive momentum term is introduced into the iteration to further accelerate learning. Numerical experiments show that the proposed algorithm eliminates the oscillation and increases the learning speed, and the convergence of the algorithm is also verified.
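The ingredients described above can be sketched in code. The following is a minimal illustration, not the paper's exact formulation: the smoothed group lasso penalty is written here as lam * sum_k sqrt(||w_k||^2 + eps), a standard way to smooth the norm at the origin consistent with the abstract's description, and the adaptive-momentum rule, hyperparameters (K, lr, lam, eps), and helper names are all illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def psnn_forward(W, X):
    """Pi-sigma forward pass: K linear summing units feed one product unit.
    W: (K, n) weights; X: (m, n) inputs (bias column included in X)."""
    S = X @ W.T                         # (m, K) summing-unit outputs
    return sigmoid(np.prod(S, axis=1))  # product unit, then activation

def smoothed_group_lasso(W, lam=1e-3, eps=1e-4):
    """lam * sum_k sqrt(||w_k||^2 + eps): differentiable at the origin,
    unlike the plain group lasso penalty lam * sum_k ||w_k||."""
    return lam * np.sum(np.sqrt(np.sum(W**2, axis=1) + eps))

def train(X, y, K=3, lr=0.01, lam=1e-3, eps=1e-4, epochs=300, seed=0):
    """Batch gradient descent with momentum and the smoothed penalty."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.normal(scale=0.5, size=(K, n))
    V = np.zeros_like(W)                # momentum buffer
    for _ in range(epochs):
        S = X @ W.T                     # (m, K)
        P = np.prod(S, axis=1)          # (m,) product-unit inputs
        out = sigmoid(P)
        dP = (out - y) * out * (1 - out)        # squared-error backprop
        grad = np.zeros_like(W)
        for k in range(K):
            # d(prod)/dS_k = product of the other summing units
            others = np.prod(np.delete(S, k, axis=1), axis=1)
            grad[k] = (dP * others) @ X / m
        # gradient of the smoothed group lasso: lam * w_k / sqrt(||w_k||^2+eps)
        norms = np.sqrt(np.sum(W**2, axis=1, keepdims=True) + eps)
        grad += lam * W / norms
        # illustrative adaptive momentum: keep momentum high while the
        # buffer still points downhill, shrink it otherwise
        mu = 0.9 if np.sum(V * grad) <= 0 else 0.5
        V = mu * V - lr * grad
        W = W + V
    return W

# tiny demo on a toy batch (inputs augmented with a bias column)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])
Xb = np.hstack([X, np.ones((len(X), 1))])
W = train(Xb, y)
pred = psnn_forward(W, Xb)
```

The product unit is what makes the network "high order": each output is a product of K affine functions of the input, so degree-K interaction terms appear without explicit feature engineering, while the per-unit (group-wise) penalty can drive whole summing units toward zero.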

