Article

EA-LSTM: Evolutionary attention-based LSTM for time series prediction

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 181, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2019.05.028

Keywords

Evolutionary computation; Deep neural network; Time series prediction

Funding

  1. National Key Research and Development of China [2016YFB0800404]
  2. National Natural Science Foundation of China [61572068, 61532005]
  3. Special Program of Beijing Municipal Science & Technology Commission [Z181100000118002]
  4. Strategic Priority Research Program of Chinese Academy of Science [XDB32030200]
  5. Fundamental Research Funds for the Central Universities of China [2018YJS032]

Time series prediction with deep learning methods, especially the Long Short-Term Memory neural network (LSTM), has achieved significant results in recent years. Although LSTM can capture long-term dependencies, its ability to pay different degrees of attention to sub-window features within multiple time steps is insufficient. To address this issue, an evolutionary attention-based LSTM trained with competitive random search is proposed for multivariate time series prediction. By transferring shared parameters, an evolutionary attention learning approach is introduced into LSTM. Thus, analogous to biological evolution, the pattern for importance-based attention sampling can be confirmed during temporal relationship mining. To avoid being trapped in local optima, as often happens with traditional gradient-based methods, a competitive random search method inspired by evolutionary computation is proposed, which can effectively configure the parameters in the attention layer. Experimental results show that the proposed model achieves competitive prediction performance compared with other baseline methods. (C) 2019 Elsevier B.V. All rights reserved.
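To make the core idea concrete, the following is a minimal sketch, not the paper's actual algorithm: it illustrates how a softmax attention vector over the time steps of an input window can be tuned by an elitist random search (candidates compete, the winner survives) rather than by gradients. The linear predictor standing in for the LSTM, the fitness function, and all hyperparameters (`iters`, `pop`, `sigma`) are illustrative assumptions.

```python
# Hedged sketch only: a toy stand-in for the paper's evolutionary
# attention + competitive random search; not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Normalize raw attention logits into weights that sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

def fitness(logits, X, y):
    # Weight each time step of the window by attention, then score a
    # trivial linear predictor (a stand-in for the LSTM's output).
    w = softmax(logits)                 # attention over T time steps
    pred = (X * w).sum(axis=1)          # attention-weighted sum across time
    return -np.mean((pred - y) ** 2)    # higher fitness = lower error

def competitive_random_search(X, y, T, iters=200, pop=8, sigma=0.3):
    # Elitist random search: perturb the incumbent, keep the best candidate
    # only if it beats the incumbent (gradient-free, so no local-gradient traps).
    best = rng.normal(size=T)
    best_fit = fitness(best, X, y)
    for _ in range(iters):
        cands = best + sigma * rng.normal(size=(pop, T))
        fits = np.array([fitness(c, X, y) for c in cands])
        k = fits.argmax()
        if fits[k] > best_fit:
            best, best_fit = cands[k], fits[k]
    return softmax(best), best_fit

# Toy data: the target depends only on the last time step of each window,
# so a good attention vector should concentrate its weight there.
T = 5
X = rng.normal(size=(200, T))
y = X[:, -1]
attn, fit = competitive_random_search(X, y, T)
print(attn)  # expect the last time step to receive most of the attention
```

The design point this mirrors is that the attention parameters are selected by population-based competition on a fitness score, so the search can escape regions where gradient descent would stall.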
