Article

Comparison of new activation functions in neural network for forecasting financial time series

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 20, Issue 3, Pages 417-439

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-010-0407-3

Keywords

Neural networks; Activation functions; Complementary log-log; Probit; Log-log; CGF algorithm; LM algorithm

Funding

  1. CNPq
  2. FACEPE

Abstract

In artificial neural networks (ANNs), the activation functions most used in practice are the logistic sigmoid and the hyperbolic tangent. The activation functions used in ANNs have been reported to play an important role in the convergence of the learning algorithms. In this paper, we evaluate the use of different activation functions and suggest three new simple functions, complementary log-log, probit and log-log, as activation functions intended to improve the performance of neural networks. Financial time series were used to evaluate the performance of ANN models using these new activation functions and to compare it with the performance of activation functions existing in the literature. This evaluation is performed through two learning algorithms: conjugate gradient backpropagation with Fletcher-Reeves updates and Levenberg-Marquardt.
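The three proposed activation functions correspond to the standard complementary log-log, probit and log-log link functions from statistics. The paper's own implementation is not reproduced here; the following is a minimal sketch, assuming the usual textbook definitions of these functions, with illustrative helper names and a comparison against the logistic sigmoid.

```python
import numpy as np
from scipy.special import erf

# Illustrative definitions only; function names are not taken from the paper.

def logistic(x):
    """Standard logistic sigmoid: maps R -> (0, 1), symmetric about x = 0."""
    return 1.0 / (1.0 + np.exp(-x))

def cloglog(x):
    """Complementary log-log: F(x) = 1 - exp(-exp(x)); asymmetric around 0."""
    return 1.0 - np.exp(-np.exp(x))

def loglog(x):
    """Log-log: F(x) = exp(-exp(-x)); the mirror image of the complementary log-log."""
    return np.exp(-np.exp(-x))

def probit(x):
    """Probit: standard normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))."""
    return 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

if __name__ == "__main__":
    x = np.array([0.0])
    for f in (logistic, probit, cloglog, loglog):
        print(f.__name__, f(x))
```

At the origin the logistic and probit functions both return 0.5, while the complementary log-log gives about 0.632 and the log-log about 0.368, which illustrates the asymmetry that distinguishes the proposed functions from the symmetric sigmoids commonly used in ANNs.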

