Article

Self-organizing radial basis function neural network using accelerated second-order learning algorithm

Journal

NEUROCOMPUTING
Volume 469, Issue -, Pages 1-12

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.10.065

Keywords

Self-organizing radial basis function neural network (SORBFNN); Adaptive expansion and pruning mechanism (AEPM) of gradient space; Accelerated second-order learning (ASOL) algorithm


An accelerated second-order learning (ASOL) algorithm is proposed to train the RBFNN; it mitigates the vanishing gradient problem, simplifies the network structure, and improves generalization ability through an adaptive expansion and pruning mechanism. Theoretical analysis and experimental results demonstrate that ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy.

Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, it remains difficult to avoid the vanishing gradient and thereby improve learning performance during training. For this reason, an accelerated second-order learning (ASOL) algorithm is developed in this paper to train the RBFNN. First, an adaptive expansion and pruning mechanism (AEPM) of the gradient space, based on the integrity and orthogonality of the hidden neurons, is designed. Effective gradient information is continually added to the gradient space, and redundant gradient information is eliminated from it. Second, with the AEPM, hidden neurons are generated or pruned accordingly. In this way, a self-organizing RBFNN (SORBFNN) that reduces structural complexity and improves generalization ability is obtained; the structure and parameters can then be optimized during learning by the proposed ASOL-based SORBFNN (ASOL-SORBFNN). Third, theoretical analyses are given of the efficiency of the proposed AEPM in avoiding the vanishing gradient and of the stability of the SORBFNN during structural adjustment, which guarantee the successful application of the proposed ASOL-SORBFNN. Finally, several experimental studies illustrate the advantages of the proposed ASOL-SORBFNN. Compared with other existing approaches, the results show that ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy. (c) 2021 Elsevier B.V. All rights reserved.
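The abstract does not give the exact AEPM criteria or the ASOL update equations, so the following is only a minimal sketch of the general ingredients it describes: a Gaussian RBF layer, a damped Gauss-Newton (Levenberg-Marquardt-style) second-order step, an orthogonality-flavored pruning test on the hidden activations, and a residual-driven growing heuristic. The class name, the shared width, the thresholds, and the growing rule are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: a self-organizing RBF network with a damped
# Gauss-Newton (LM-style) second-order step and a simple orthogonality-based
# pruning test. Thresholds and the growing rule are generic placeholders,
# NOT the paper's AEPM/ASOL.
import numpy as np

class TinySORBF:
    def __init__(self, centers, sigma=1.0):
        self.centers = np.atleast_2d(centers)        # (H, d) RBF centers
        self.sigma = sigma                           # shared width (assumption)
        self.w = np.zeros(len(self.centers))         # output weights

    def hidden(self, X):
        # Gaussian activations, shape (N, H)
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def predict(self, X):
        return self.hidden(X) @ self.w

    def lm_step(self, X, y, mu=1e-2):
        # Second-order (Levenberg-Marquardt-style) update of the output
        # weights only: for the weights the Jacobian of the residuals is Phi.
        Phi = self.hidden(X)
        r = self.predict(X) - y
        H = Phi.T @ Phi + mu * np.eye(Phi.shape[1])  # damped Gauss-Newton matrix
        self.w -= np.linalg.solve(H, Phi.T @ r)

    def prune_redundant(self, X, tol=1e-2):
        # Orthogonality-inspired pruning (illustrative): drop a hidden neuron
        # whose activation column is almost in the span of the other columns.
        Phi = self.hidden(X)
        keep = []
        for j in range(Phi.shape[1]):
            others = np.delete(Phi, j, axis=1)
            if others.shape[1] == 0:
                keep.append(j)
                continue
            coef, *_ = np.linalg.lstsq(others, Phi[:, j], rcond=None)
            resid = Phi[:, j] - others @ coef
            if np.linalg.norm(resid) > tol * np.linalg.norm(Phi[:, j]):
                keep.append(j)
        self.centers, self.w = self.centers[keep], self.w[keep]

    def grow_at_worst_sample(self, X, y):
        # Simple growing heuristic (not the paper's criterion): place a new
        # center at the training sample with the largest absolute residual.
        i = np.argmax(np.abs(self.predict(X) - y))
        self.centers = np.vstack([self.centers, X[i]])
        self.w = np.append(self.w, 0.0)

# Toy usage: fit y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(X).ravel()
net = TinySORBF(centers=X[::50], sigma=0.8)
for epoch in range(30):
    net.lm_step(X, y)
    if epoch % 10 == 9:                  # occasionally adjust the structure
        net.grow_at_worst_sample(X, y)
        net.prune_redundant(X)
print("hidden neurons:", len(net.w), "MSE:", np.mean((net.predict(X) - y) ** 2))
```

The sketch applies the second-order step only to the linear output weights, which keeps the Jacobian trivial; the paper's ASOL algorithm, by contrast, couples the second-order update with the gradient-space expansion and pruning of all hidden-layer information, which is not reproduced here.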
