Journal
NEURAL COMPUTING & APPLICATIONS
Volume 20, Issue 3, Pages 417-439
Publisher
SPRINGER LONDON LTD
DOI: 10.1007/s00521-010-0407-3
Keywords
Neural networks; Activation functions; Complementary log-log; Probit; Log-log; CGF algorithm; LM algorithm
Funding
- CNPq
- FACEPE
Abstract
In artificial neural networks (ANNs), the activation functions most used in practice are the logistic sigmoid and the hyperbolic tangent. The activation function used in an ANN is known to play an important role in the convergence of the learning algorithm. In this paper, we evaluate the use of different activation functions and propose three simple new ones, complementary log-log, probit and log-log, as activation functions intended to improve the performance of neural networks. Financial time series were used to evaluate the performance of ANN models with these new activation functions and to compare them with activation functions already in the literature. The evaluation is carried out with two learning algorithms: conjugate gradient backpropagation with Fletcher-Reeves updates (CGF) and Levenberg-Marquardt (LM).