Journal
NEURAL COMPUTING & APPLICATIONS
Volume 20, Issue 3, Pages 417-439
Publisher
SPRINGER LONDON LTD
DOI: 10.1007/s00521-010-0407-3
Keywords
Neural networks; Activation functions; Complementary log-log; Probit; Log-log; CGF algorithm; LM algorithm
Funding
- CNPq
- FACEPE
Abstract
In artificial neural networks (ANNs), the activation functions most used in practice are the logistic sigmoid and the hyperbolic tangent. The activation functions used in ANNs have been reported to play an important role in the convergence of learning algorithms. In this paper, we evaluate the use of different activation functions and propose three new simple functions, complementary log-log, probit and log-log, as activation functions to improve the performance of neural networks. Financial time series were used to evaluate the performance of ANN models with these new activation functions and to compare it with that of activation functions existing in the literature. The evaluation is performed with two learning algorithms: conjugate gradient backpropagation with Fletcher-Reeves updates (CGF) and Levenberg-Marquardt (LM).
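As a point of reference for the three functions named in the abstract, the sketch below uses their standard definitions from the statistics literature on binary-response link functions (complementary log-log, log-log, and probit via the standard normal CDF); the paper's exact parameterization may differ, so treat these formulas as an assumption, not the authors' implementation.

```python
import math

def cloglog(x: float) -> float:
    # Complementary log-log: F(x) = 1 - exp(-exp(x)); maps R onto (0, 1),
    # asymmetric around its inflection point, unlike the logistic sigmoid.
    return 1.0 - math.exp(-math.exp(x))

def loglog(x: float) -> float:
    # Log-log: F(x) = exp(-exp(-x)); the mirror image of cloglog.
    return math.exp(-math.exp(-x))

def probit(x: float) -> float:
    # Probit: the standard normal CDF Phi(x), written via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```

At x = 0, cloglog gives 1 - 1/e ≈ 0.632 and loglog gives 1/e ≈ 0.368, while probit gives exactly 0.5, which illustrates the asymmetry that distinguishes the log-log pair from sigmoid-like functions.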