Journal
NEUROCOMPUTING
Volume 332, Issue -, Pages 320-327
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2018.12.016
Keywords
Short-term traffic flow prediction; Noise processing; LSTM feature enhancement; Attention mechanism
Funding
- Natural Science Foundation of China [61602407, 61472363, 61572436]
- Natural Science Foundation of Zhejiang Province [LY18F010007]
- Opening Foundation of Engineering Research Center of Intelligent Transport of Zhejiang Province [2016ERCITZJ-KF02]
Long short-term memory (LSTM) networks are widely used to process and predict time-series events, but they struggle with exceedingly long-term dependencies, possibly because LSTM error grows as the sequence length increases. Researchers have recently noted that adding features on multiple time scales can help improve the long-term dependency of an RNN. Inspired by the attention mechanism, and considering the need for historical data in traffic flow prediction, we propose an improved approach that connects the high-impact values of remarkably long sequences of past time steps to the current time step; these high-impact traffic flow values are captured using the attention mechanism. At the same time, we smooth out data beyond the normal range to obtain better prediction results. The experimental results show that the proposed prediction model is competitive in short-term traffic flow prediction. (C) 2018 Elsevier B.V. All rights reserved.
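The abstract mentions two ideas without giving code: attending over historical hidden states to pull high-impact past values into the current step, and smoothing out-of-range traffic values. The sketch below is a minimal, hypothetical illustration of both (not the authors' actual model), using plain dot-product attention and 3-sigma clipping, assuming NumPy; the function names `attention_context` and `smooth_outliers` are our own.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(hidden_states, query):
    """Score each historical hidden state against the current query state
    (dot-product attention), then return the attention-weighted sum."""
    scores = hidden_states @ query      # (T,) similarity of each past step
    weights = softmax(scores)           # (T,) non-negative, sums to 1
    context = weights @ hidden_states   # (d,) weighted combination of history
    return context, weights

def smooth_outliers(series, k=3.0):
    """Clip values beyond k standard deviations of the mean -- one simple
    way to smooth data that falls outside the normal range."""
    mu, sigma = series.mean(), series.std()
    return np.clip(series, mu - k * sigma, mu + k * sigma)

rng = np.random.default_rng(0)
H = rng.normal(size=(24, 8))   # e.g. 24 historical time steps, hidden size 8
q = rng.normal(size=8)         # current LSTM hidden state
ctx, w = attention_context(H, q)

flow = rng.normal(loc=100.0, scale=10.0, size=288)  # synthetic daily flow
flow[10] = 500.0               # an out-of-range spike (sensor noise)
clean = smooth_outliers(flow)
```

The attention weights make the context vector lean toward the historical steps most similar to the current state, which is the general mechanism the abstract appeals to for bridging very long dependencies.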