4.2 Review

REVIEW AND ANALYSIS OF HIDDEN NEURON NUMBER EFFECT OF SHALLOW BACKPROPAGATION NEURAL NETWORKS

Journal

NEURAL NETWORK WORLD
Volume 30, Issue 2, Pages 97-112

Publisher

ACAD SCIENCES CZECH REPUBLIC, INST COMPUTER SCIENCE
DOI: 10.14311/NNW.2020.30.008

Keywords

backpropagation neural networks; hidden neuron number; training patterns; total processing elements

Funding

  1. Chinese Academy of Sciences' Institute of Automation (CASIA)

Abstract

Shallow neural network implementations remain popular for real-life classification problems that require rapid results with limited data. Selecting parameters such as the hidden neuron number, learning rate, and momentum factor is the main challenge that causes time loss in these implementations. Among these parameters, determining the number of hidden neurons is the main difficulty, since it affects both the training and generalization phases of any neural system in terms of learning efficiency and accuracy. In this study, several experiments are performed to observe the effect of the hidden neuron number of a 3-layered backpropagation neural network on the generalization rate of classification problems, using both numerical datasets and image databases. The experiments are performed for an increasing number of total processing elements, and various numbers of hidden neurons are used during training. The results for each hidden neuron number are analyzed in terms of accuracy rate and the number of iterations needed for convergence. The results show that the effect of the hidden neuron number mainly depends on the number of training patterns. The obtained results also suggest intervals of hidden neuron numbers for different numbers of total processing elements and training patterns.
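As a rough illustration of the kind of experiment the abstract describes (not the authors' code or setup), the sketch below trains a 3-layered backpropagation network while sweeping the hidden neuron count and records the test accuracy and the number of iterations to convergence for each setting. The dataset, hidden-layer sizes, learning rate, and momentum factor are illustrative assumptions.

```python
# Minimal sketch: sweep the hidden neuron count of a shallow (3-layer)
# backpropagation network and record accuracy and convergence iterations.
# Dataset and hyperparameters are assumptions, not the paper's configuration.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for n_hidden in (5, 10, 20, 40, 80):
    # One hidden layer -> input, hidden, output = 3 layers in total.
    net = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        solver="sgd",             # plain gradient-descent backpropagation
                        learning_rate_init=0.01,  # assumed learning rate
                        momentum=0.9,             # assumed momentum factor
                        max_iter=2000,
                        random_state=0)
    net.fit(X_train, y_train)
    acc = net.score(X_test, y_test)
    print(f"hidden={n_hidden:3d}  accuracy={acc:.3f}  iterations={net.n_iter_}")
```

Plotting accuracy and iteration count against the hidden neuron number for datasets of different sizes would reproduce, in spirit, the kind of interval analysis the abstract reports.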

Reviews

Primary rating

4.2 (insufficient ratings)

Secondary ratings

Novelty: -
Significance: -
Scientific rigor: -