Article

Comparison of new activation functions in neural network for forecasting financial time series

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 20, Issue 3, Pages 417-439

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-010-0407-3

Keywords

Neural networks; Activation functions; Complementary log-log; Probit; Log-log; CGF algorithm; LM algorithm

Funding

  1. CNPq
  2. FACEPE


In artificial neural networks (ANNs), the activation functions most used in practice are the logistic sigmoid and the hyperbolic tangent. The activation functions used in ANNs play an important role in the convergence of the learning algorithms. In this paper, we evaluate the use of different activation functions and propose three new simple ones, complementary log-log, probit and log-log, as activation functions intended to improve the performance of neural networks. Financial time series were used to evaluate the performance of ANN models using these new activation functions and to compare it with that of activation functions existing in the literature. The evaluation is performed with two learning algorithms: conjugate gradient backpropagation with Fletcher-Reeves updates and Levenberg-Marquardt.
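The abstract does not give formulas, but the three proposed activations are conventionally defined as the inverse link functions of the same names from generalized linear models, each mapping the real line to (0, 1) like the logistic sigmoid. A minimal sketch under that assumption (function names are illustrative, not from the paper):

```python
import math

def probit(x):
    # Standard normal CDF: the inverse of the probit link.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def cloglog(x):
    # Inverse of the complementary log-log link: 1 - exp(-exp(x)).
    return 1.0 - math.exp(-math.exp(x))

def loglog(x):
    # Inverse of the log-log link: exp(-exp(-x)).
    return math.exp(-math.exp(-x))
```

Note that, unlike the logistic sigmoid, cloglog and loglog are asymmetric around zero (e.g. cloglog(0) = 1 - e^(-1) ≈ 0.632), which is one way these functions differ in shape from the standard choices.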

