Article

Novel hybrid multi-head self-attention and multifractal algorithm for non-stationary time series prediction

Journal

INFORMATION SCIENCES
Volume 613, Pages 541-555

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2022.08.126

Keywords

Non-stationary; Time series prediction; Multifractal; Multi-head self-attention; GRU

Funding

  1. National Natural Science Foundation of China (NSFC) [U19B6003, U1911205]
  2. Hubei Province Department of Science and Technology [ZRZY2022KJ07]
  3. Hubei Province Natural Science Foundation [2019CFA023]

This study proposes a novel dynamic recurrent neural network for stable and robust prediction of non-stationary multivariate time series. By extracting volatility characteristics and introducing a self-attention mechanism, the proposed model outperforms traditional methods in experiments.
Traditional time series prediction methods have shown outstanding capabilities in time series prediction. However, due to essential differences in volatility characteristics among diverse types of non-stationary multivariate time series (NSMTS), it is difficult for traditional methods to maintain robust prediction performance. This study proposes a novel dynamic recurrent neural network to achieve stable and robust prediction performance. First, a multifractal gated recurrent unit (MF-GRU) based on the multifractal method is proposed to extract volatility characteristics. Meanwhile, to strengthen the parameters of the historical hidden-layer states that have a more significant impact on the output, a self-attention mechanism is introduced into the MF-GRU, leading to a multifractal gated recurrent unit multi-head self-attention model. The efficiency of the proposed model was verified on public datasets. The experimental results show that the proposed model outperforms traditional methods such as long short-term memory (LSTM), the gated recurrent unit (GRU), and the minimal gated unit (MGU). (c) 2022 Elsevier Inc. All rights reserved.
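The abstract's core idea, running a GRU over a multivariate series and then applying multi-head self-attention across the stack of historical hidden states so that the most influential past states are weighted more heavily, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the multifractal volatility features that define the MF-GRU are omitted, and all sizes and weight initializations below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    # Standard GRU update. The paper's MF-GRU additionally injects
    # multifractal volatility features, which are not reproduced here.
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sig(x @ Wz + h @ Uz)                  # update gate
    r = sig(x @ Wr + h @ Ur)                  # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h + z * h_tilde

def multi_head_self_attention(H, n_heads, Wq, Wk, Wv, Wo):
    # H: (T, d) stack of historical hidden states; each head attends
    # over all time steps in a d/n_heads-dimensional subspace.
    T, d = H.shape
    dk = d // n_heads
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    heads = []
    for i in range(n_heads):
        q, k, v = (M[:, i * dk:(i + 1) * dk] for M in (Q, K, V))
        scores = q @ k.T / np.sqrt(dk)
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)    # row-wise softmax
        heads.append(w @ v)
    return np.concatenate(heads, axis=-1) @ Wo

# Hypothetical sizes: 8 input features, hidden size 16, 4 heads, 20 steps.
d_in, d, n_heads, T = 8, 16, 4, 20
gru_params = [rng.normal(scale=0.1, size=s)
              for s in [(d_in, d), (d, d)] * 3]      # Wz,Uz,Wr,Ur,Wh,Uh
Wq, Wk, Wv, Wo = (rng.normal(scale=0.1, size=(d, d)) for _ in range(4))

X = rng.normal(size=(T, d_in))   # one window of a multivariate series
h = np.zeros(d)
H = np.empty((T, d))
for t in range(T):
    h = gru_step(X[t], h, *gru_params)
    H[t] = h

context = multi_head_self_attention(H, n_heads, Wq, Wk, Wv, Wo)
# One-step-ahead forecast from the attended state at the last step.
y_hat = context[-1] @ rng.normal(scale=0.1, size=(d, 1))
print(H.shape, context.shape, y_hat.shape)
```

In this sketch the attention weights play the role described in the abstract: hidden states with larger scores contribute more to the context vector from which the forecast is read out.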
