Article

Neural Network Learning without Backpropagation

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 21, Issue 11, Pages 1793-1803

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2010.2073482

Keywords

Forward-only computation; Levenberg-Marquardt algorithm; neural network training


The method introduced in this paper allows arbitrarily connected neural networks to be trained, so more powerful architectures with connections across layers can be trained efficiently. The method also simplifies training by replacing the traditional forward-and-backward computation with forward-only computation: the information needed for the gradient vector (in first-order algorithms) and for the Jacobian or Hessian matrix (in second-order algorithms) is obtained during the forward pass. Because the proposed algorithm can train these more complex architectures, the same problems can be solved with a much smaller number of neurons. Computation-cost comparisons show that the proposed forward-only computation can be faster than the traditional implementation of the Levenberg-Marquardt algorithm.
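The abstract refers to the Levenberg-Marquardt weight update, which uses the Jacobian of the residuals rather than just the gradient. The sketch below illustrates that update rule, w ← w − (JᵀJ + μI)⁻¹Jᵀe, on a two-parameter toy model; all function names and the model itself are hypothetical and chosen only for illustration, not taken from the paper.

```python
import numpy as np

# Toy model with two weights: y_hat = exp(w0 * x) + w1.
# This stands in for a tiny "network" so the LM update can be shown end to end.

def residuals_and_jacobian(w, x, y):
    """Return the error vector e and the Jacobian J = d e / d w."""
    y_hat = np.exp(w[0] * x) + w[1]
    e = y_hat - y                                 # residuals, shape (N,)
    J = np.column_stack([x * np.exp(w[0] * x),    # d e / d w0
                         np.ones_like(x)])        # d e / d w1
    return e, J

def lm_step(w, x, y, mu=0.01):
    """One Levenberg-Marquardt update: w - (J^T J + mu I)^{-1} J^T e."""
    e, J = residuals_and_jacobian(w, x, y)
    H = J.T @ J + mu * np.eye(len(w))             # damped Gauss-Newton Hessian
    return w - np.linalg.solve(H, J.T @ e)

# Data generated from true weights (0.5, 1.0)
x = np.linspace(0.0, 1.0, 20)
y = np.exp(0.5 * x) + 1.0

w = np.array([0.1, 0.0])                          # deliberately poor initial guess
for _ in range(20):
    w = lm_step(w, x, y)

e, _ = residuals_and_jacobian(w, x, y)
sse = np.sum(e ** 2)                              # sum-squared error after training
```

The paper's contribution is not this update rule itself but how J is obtained: during the forward pass alone, instead of the usual forward-plus-backward computation.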

