Article

A Growing and Pruning Method for Radial Basis Function Networks

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 20, Issue 6, Pages 1039-1045

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2009.2019270

Keywords

Gaussian mixture model (GMM); growing and pruning algorithms; radial basis function (RBF) neural networks; resource-allocating network (RAN); sequential function approximation

Abstract

A recently published generalized growing and pruning (GGAP) training algorithm for radial basis function (RBF) neural networks is studied and modified. GGAP is a resource-allocating network (RAN) algorithm, meaning that a previously created network unit that consistently makes little contribution to the network's performance can be removed during training. GGAP defines a formula for computing the significance of the network units, which requires a d-fold numerical integration over an arbitrary probability density function p(x) of the input data x ∈ R^d. In this work, the GGAP formula is approximated by fitting a Gaussian mixture model (GMM) to p(x), and an analytical solution of the approximated unit significance is derived. This makes it possible to apply the modified GGAP to input data with complex, high-dimensional p(x), which was not possible with the original GGAP. The results of an extensive experimental study show that the modified algorithm outperforms the original GGAP, achieving both lower prediction error and reduced complexity of the trained network.
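As a rough illustration of why a GMM for p(x) removes the need for d-fold numerical integration, the sketch below evaluates a unit-significance-style integral in closed form: the Gaussian RBF activation is proportional to a Gaussian density, and the integral of a product of two Gaussians is itself a Gaussian evaluated at the unit centre. This is a minimal sketch under assumed conventions (kernel width with a factor of 2σ², significance taken simply as |α_k|·E[φ_k(x)], and all function and variable names invented for illustration); the exact significance formula derived in the paper may differ.

```python
import numpy as np
from scipy.stats import multivariate_normal


def unit_significance_gmm(alpha_k, mu_k, sigma_k, weights, means, covs):
    """Closed-form value of |alpha_k| * ∫ exp(-||x - mu_k||^2 / (2*sigma_k^2)) p(x) dx,
    where p(x) = sum_j weights[j] * N(x; means[j], covs[j]) is a GMM fitted to the inputs.

    Uses two standard identities:
      exp(-||x - mu||^2 / (2 s^2)) = (2*pi*s^2)^(d/2) * N(x; mu, s^2 I)
      ∫ N(x; mu, A) * N(x; m, B) dx = N(mu; m, A + B)
    """
    d = mu_k.shape[0]
    kernel_mass = (2.0 * np.pi * sigma_k ** 2) ** (d / 2.0)  # RBF kernel normalizer
    expectation = sum(
        w * multivariate_normal.pdf(mu_k, mean=m, cov=C + sigma_k ** 2 * np.eye(d))
        for w, m, C in zip(weights, means, covs)
    )
    return abs(alpha_k) * kernel_mass * expectation


def unit_significance_mc(alpha_k, mu_k, sigma_k, weights, means, covs, n=50_000, seed=0):
    """Monte Carlo estimate of the same integral, used only to check the closed form."""
    rng = np.random.default_rng(seed)
    counts = rng.multinomial(n, weights)  # samples drawn per mixture component
    x = np.vstack([rng.multivariate_normal(m, C, size=c)
                   for m, C, c in zip(means, covs, counts)])
    kern = np.exp(-np.sum((x - mu_k) ** 2, axis=1) / (2.0 * sigma_k ** 2))
    return abs(alpha_k) * kern.mean()


if __name__ == "__main__":
    d = 3
    weights = [0.6, 0.4]
    means = [np.zeros(d), np.full(d, 2.0)]
    covs = [np.eye(d), 0.5 * np.eye(d)]
    mu_k, sigma_k, alpha_k = np.full(d, 1.0), 0.8, -1.3  # a hypothetical RBF unit
    print(unit_significance_gmm(alpha_k, mu_k, sigma_k, weights, means, covs))
    print(unit_significance_mc(alpha_k, mu_k, sigma_k, weights, means, covs))
```

The closed form costs one Gaussian density evaluation per mixture component, independent of the input dimension d, which is the practical advantage the abstract attributes to the GMM approximation.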
