Article

Fine-tuned support vector regression model for stock predictions

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 35, Issue 32, Pages 23295-23309

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-021-05842-w

Keywords

Grid search; Machine learning; Root mean square error; Mean absolute percentage error; Support vector regression; Volatility

This paper proposes a new machine learning technique for stock forecasting based on time series data. By optimizing the parameters through grid search on the training dataset, the model achieves increased accuracy, reduced time and memory requirements, and minimized data overfitting. The proposed method performs better in analyzing performance parameters of the stock market and requires less time compared to similar methods.
In this paper, a new machine learning (ML) technique is proposed that uses a fine-tuned version of support vector regression (SVR) for stock forecasting on time series data. A grid search is applied over the training dataset to select the best kernel function and to optimize its parameters, and the optimized parameters are then checked against a validation dataset. Tuning these parameters to their optimal values not only increases the model's overall accuracy but also reduces time and memory requirements, and it helps keep the model from overfitting the data. The proposed method is used to analyze different performance parameters of the stock market, such as up-to-daily and up-to-monthly return, cumulative monthly return, volatility, and the associated risk. Eight large datasets are chosen from different domains, and the stock is predicted for each case using the proposed method. A comparison is carried out between the proposed method and similar methods of the same interest in terms of the computed root mean square error (RMSE) and mean absolute percentage error (MAPE). The comparison shows the proposed method to be more accurate in predicting stocks for the chosen datasets, and it also requires much less time than its counterpart methods.
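As a rough illustration of the workflow described in the abstract, the sketch below uses scikit-learn's GridSearchCV to select an SVR kernel and tune its parameters on the training data, then reports RMSE and MAPE on held-out data. This is a minimal sketch under assumed settings, not the authors' exact pipeline: the parameter grid, the time-series cross-validation split, and the feature preparation are illustrative choices.

```python
# Minimal sketch: grid search over SVR kernels and parameters, evaluated
# with RMSE and MAPE. Parameter grid and CV scheme are assumptions, not
# the paper's exact configuration.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.metrics import mean_squared_error, mean_absolute_percentage_error


def fit_svr_grid_search(X_train, y_train):
    """Select the best kernel function and hyperparameters on the training set."""
    pipeline = Pipeline([("scale", StandardScaler()), ("svr", SVR())])
    param_grid = {
        "svr__kernel": ["linear", "rbf", "poly"],  # candidate kernel functions
        "svr__C": [0.1, 1.0, 10.0, 100.0],         # regularization strength
        "svr__gamma": ["scale", 0.01, 0.1],        # kernel coefficient
        "svr__epsilon": [0.01, 0.1, 0.5],          # width of the epsilon-insensitive tube
    }
    # Time-ordered splits avoid leaking future observations into training folds.
    search = GridSearchCV(
        pipeline,
        param_grid,
        cv=TimeSeriesSplit(n_splits=5),
        scoring="neg_root_mean_squared_error",
        n_jobs=-1,
    )
    search.fit(X_train, y_train)
    return search.best_estimator_, search.best_params_


def evaluate(model, X_test, y_test):
    """Compute the two comparison metrics mentioned in the abstract: RMSE and MAPE."""
    y_pred = model.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, y_pred))
    mape = mean_absolute_percentage_error(y_test, y_pred) * 100.0  # percent
    return rmse, mape
```

A typical usage would fit the grid search on the training window, confirm the chosen parameters on a validation window, and only then score the test window with `evaluate`, mirroring the train/validate/test separation described above.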