Journal
NEUROCOMPUTING
Volume 277, Pages 29-37
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2016.12.111
Keywords
Stochastic resonance; Neural networks
Funding
- JSPS KAKENHI Grant [JP26700025]
- Grants-in-Aid for Scientific Research [16KT0015, 17H05908] Funding Source: KAKEN
Stochastic resonance (SR) is a phenomenon by which the input signal of a nonlinear system, with magnitude too small to affect the output, becomes observable by adding a non-zero level of noise to the system. SR is known to assist biological beings in coping with noisy environments, providing sophisticated information processing and adaptive behaviors. The SR effect can be interpreted as a decrease in the input-output information loss of a nonlinear system by making it stochastically closer to a linear system. This work shows how SR can improve the performance of a system even when the desired input-output relationship is nonlinear, specifically for the case of a neural network whose hidden layers consist of threshold functions. The universal approximation capability of neural networks exploiting SR is then discussed: although a network consisting of threshold activation functions has been proven to be a universal approximator in the context of the extreme learning machine (ELM), once SR is taken into account, the system can be deemed a classic three-layer neural network whose universality has been previously established by simpler proofs. After proving the universal approximation capability for an infinite number of hidden units, the performance achieved with a finite number of hidden units is evaluated using two training algorithms, namely backpropagation and ELM. Results highlight the SR effect occurring in the proposed system, and the relationship among the number of hidden units, noise intensity, and approximation performance. (C) 2017 Elsevier B.V. All rights reserved.
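The core mechanism described in the abstract — a subthreshold input made observable by averaging over noisy threshold units — can be sketched as follows. This is a minimal illustration only; the threshold value, noise level, and number of units are arbitrary choices for demonstration, not parameters taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Subthreshold input: peak amplitude 0.5 stays below the firing threshold
# of 1.0, so a single noiseless threshold unit never responds to it.
t = np.linspace(0.0, 2.0 * np.pi, 1000)
signal = 0.5 * np.sin(t)
THETA = 1.0  # firing threshold (illustrative value)

def population_output(x, noise_std, n_units=200):
    """Average response of n_units threshold units with i.i.d. Gaussian noise."""
    noise = rng.normal(0.0, noise_std, size=(n_units, x.size))
    return ((x + noise) >= THETA).mean(axis=0)

out_silent = population_output(signal, 0.0)  # no noise: output stuck at 0
out_noisy = population_output(signal, 0.6)   # moderate noise: input recoverable
corr = np.corrcoef(signal, out_noisy)[0, 1]
```

Without noise the population output is identically zero and carries no information about the input; with a moderate noise level the averaged firing rate tracks the subthreshold sine wave, which is the SR effect the paper exploits in the hidden layer of a threshold-activation network.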