Article

An Improvement of Stochastic Gradient Descent Approach for Mean-Variance Portfolio Optimization Problem

Journal

JOURNAL OF MATHEMATICS
Volume 2021, Issue -, Pages -

Publisher

HINDAWI LTD
DOI: 10.1155/2021/8892636

Keywords

-

Funding

  1. Ministry of Education Malaysia (MOE) [FRGS/1/2018/STG06/UTHM/02/5]
  2. Universiti Tun Hussein Onn Malaysia

Abstract

The improved Adam algorithm (AdamSE) presented in this paper requires fewer iterations and converges faster when solving the portfolio optimization problem.
In this paper, a current variant of the stochastic gradient descent (SGD) approach, namely the adaptive moment estimation (Adam) approach, is improved by adding the standard error to the updating rule. The aim is to accelerate the convergence of the Adam algorithm. The improvement is termed the Adam with standard error (AdamSE) algorithm. Separately, a mean-variance portfolio optimization model is formulated from historical data on the rates of return of the S&P 500 stock, the 10-year Treasury bond, and the money market. The application of the SGD, Adam, adaptive moment estimation with maximum (AdaMax), Nesterov-accelerated adaptive moment estimation (Nadam), AMSGrad, and AdamSE algorithms to solving this mean-variance portfolio optimization problem is then investigated. In each case the iterative solution converges to the optimal portfolio, and the AdamSE algorithm requires the smallest number of iterations. The results show that the convergence rate of the Adam algorithm is significantly improved by the AdamSE modification. In conclusion, the efficiency of improving Adam with the standard error has been demonstrated, and the applicability of the SGD, Adam, AdaMax, Nadam, AMSGrad, and AdamSE algorithms to the mean-variance portfolio optimization problem has been validated.
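The abstract does not reproduce the exact AdamSE updating rule, so the sketch below is only a plausible reading: it assumes the standard error enters the Adam denominator as sqrt(v_hat / t) in place of sqrt(v_hat). The return and covariance figures for the three assets are made-up illustrative numbers, not the paper's data, and the budget constraint sum(w) = 1 is handled by substituting out one weight rather than by whatever method the paper uses.

```python
import numpy as np

# Illustrative (assumed) statistics for the three assets in the paper:
# S&P 500 stock, 10-year Treasury bond, money market.
mu = np.array([0.08, 0.04, 0.02])              # expected returns (made up)
Sigma = np.array([[0.0400, 0.0020, 0.0000],
                  [0.0020, 0.0100, 0.0000],
                  [0.0000, 0.0000, 0.0001]])   # covariance (made up)
lam = 2.0                                      # risk-aversion parameter

# Eliminate the budget constraint sum(w) = 1 by substitution:
# w = A @ z + b, where z holds the two free weights.
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.array([0.0, 0.0, 1.0])

def full_weights(z):
    return A @ z + b

def objective(z):
    # Mean-variance objective: (lam/2) w'Sigma w - mu'w.
    w = full_weights(z)
    return 0.5 * lam * w @ Sigma @ w - mu @ w

def gradient(z):
    # Chain rule through the substitution w = A z + b.
    w = full_weights(z)
    return A.T @ (lam * Sigma @ w - mu)

def adam_se(z, lr=0.005, beta1=0.9, beta2=0.999, eps=1e-8, iters=200):
    """Adam with an ASSUMED standard-error denominator sqrt(v_hat / t)."""
    m = np.zeros_like(z)
    v = np.zeros_like(z)
    for t in range(1, iters + 1):
        g = gradient(z)
        m = beta1 * m + (1 - beta1) * g           # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g       # second-moment estimate
        m_hat = m / (1 - beta1 ** t)              # bias correction
        v_hat = v / (1 - beta2 ** t)
        se = np.sqrt(v_hat / t)   # assumed standard-error term in denominator
        z = z - lr * m_hat / (se + eps)
    return z

z = adam_se(np.array([1 / 3, 1 / 3]))  # start from the equal-weight portfolio
w = full_weights(z)
```

Dividing by the assumed standard error rather than sqrt(v_hat) shrinks the denominator as iterations accumulate, so the steps are larger than plain Adam's, which is one way a standard-error term could yield the faster convergence the paper reports.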

