4.1 Article

Effects of moving the centers in an RBF network

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 13, Issue 6, Pages 1299-1307

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2002.804286

Keywords

generalized methods; gradient methods; Hessian matrices; intelligent networks; learning systems; neural-network architecture; nonlinear estimation

Abstract

In radial basis function (RBF) networks, placement of centers is said to have a significant effect on the performance of the network. Supervised learning of center locations has, in some applications, been shown to produce networks superior to those whose centers are located using unsupervised methods. However, such networks can take as long to train as sigmoid networks; the increased time needed for supervised learning offsets the training-time advantage of regular RBF networks. One way to overcome this may be to train the network with a set of centers selected by unsupervised methods and then to fine-tune the center locations. This can be done by first evaluating whether moving the centers would decrease the error and then, depending on the required level of accuracy, changing the center locations. This paper provides new results on bounds for the gradient and Hessian of the error, considered first as a function of the independent set of parameters, namely the centers, widths, and weights, and then as a function of the centers and widths alone, where the linear weights are functions of the basis-function parameters, for networks of fixed size. Moreover, bounds for the Hessian are also provided along a line beginning at the initial set of parameters. Using these bounds, it is possible to estimate how much the error can be reduced by moving the centers; furthermore, a step size can be specified to achieve a guaranteed amount of reduction in error.
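The two-stage procedure described in the abstract can be illustrated with a short NumPy sketch: centers are first placed by an unsupervised rule, the linear output weights are solved by least squares as a function of the centers and width, and the centers are then nudged with one gradient step whose size would, in the paper, be chosen from the derived bounds. This is a minimal sketch under assumed choices (Gaussian basis functions, a single fixed width, random-subsample center selection standing in for k-means, and an arbitrary step size eta); none of the function names or values come from the paper itself.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def solve_weights(Phi, y):
    """Linear output weights as a function of the basis-function parameters
    (least-squares solution for fixed centers and width)."""
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def center_gradient(X, y, centers, width, w):
    """Gradient of E = 0.5 * ||Phi w - y||^2 with respect to each center c_j:
    dE/dc_j = sum_i r_i * w_j * Phi[i, j] * (x_i - c_j) / width^2."""
    Phi = rbf_design(X, centers, width)
    r = Phi @ w - y
    grad = np.zeros_like(centers)
    for j in range(centers.shape[0]):
        coef = r * w[j] * Phi[:, j] / width ** 2
        grad[j] = (coef[:, None] * (X - centers[j])).sum(axis=0)
    return grad

# Toy data: a noisy 1-D sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

# Stage 1: "unsupervised" center placement -- here simply a random subsample
# of the data, standing in for k-means or any other clustering rule.
centers = X[rng.choice(len(X), size=8, replace=False)].copy()
width = 0.8

Phi = rbf_design(X, centers, width)
w = solve_weights(Phi, y)
err_before = 0.5 * np.sum((Phi @ w - y) ** 2)

# Stage 2: fine-tune the centers with one gradient step, then re-solve the
# linear weights. `eta` here is an arbitrary step size; the paper's point is
# that bounds on the gradient and Hessian allow choosing a step with a
# guaranteed amount of error reduction instead.
eta = 0.01
centers -= eta * center_gradient(X, y, centers, width, w)
Phi = rbf_design(X, centers, width)
w = solve_weights(Phi, y)
err_after = 0.5 * np.sum((Phi @ w - y) ** 2)

print(f"sum-squared error before: {err_before:.4f}, after one center step: {err_after:.4f}")
```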
