Article

2F-TP: Learning Flexible Spatiotemporal Dependency for Flexible Traffic Prediction

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/TITS.2022.3146899

Keywords

Roads; Predictive models; Adaptation models; Recurrent neural networks; Data models; Spatiotemporal phenomena; Logic gates; Spatiotemporal dependency; traffic prediction; graph convolutional networks; attention mechanism

Funding

  1. National Natural Science Foundation of China [61872050, 62172066, U2013202, 61922053]


Accurate traffic prediction is crucial in Intelligent Transportation Systems. This paper proposes a novel Dual Graph Gated Recurrent Neural Network (DG²RNN) to model various dependencies and provide flexible predictions of future traffic flow. The model outperforms state-of-the-art methods on real-world traffic datasets and shows stable performance for flexible prediction with varying horizons.
Accurate traffic prediction is a critical yet challenging task in Intelligent Transportation Systems, benefiting a variety of smart services, e.g., route planning and traffic management. Although extensive efforts have been devoted to this problem, it is still not well solved due to the flexible dependency within traffic data along both the spatial and temporal dimensions. In this paper, we explore this flexibility from three aspects, namely the time-varying local spatial dependency, the dynamic temporal dependency, and the global spatial dependency. We then propose a novel Dual Graph Gated Recurrent Neural Network (DG²RNN) to effectively model all of these dependencies and offer flexible (multi-step) predictions of future traffic flow. Specifically, we design a Dual Graph Convolution Module to capture the local spatial dependency from two perspectives, namely road distance and adaptive correlation. To model the dynamic temporal dependency, we first develop a Bidirectional Gated Recurrent Layer to capture the forward and backward sequential contexts of historical traffic flow, and then combine the derived hidden states according to their respective contributions learned by a temporal attention mechanism. In addition, we design a spatial attention mechanism to learn the latent global spatial dependency among all locations to facilitate the prediction. Extensive experiments on three types of real-world traffic datasets demonstrate that our model outperforms state-of-the-art methods. Results also show that our model achieves more stable performance for flexible prediction with varying prediction horizons.
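The abstract above outlines the model's main components. The following is a minimal PyTorch sketch of how such an architecture could be assembled: a dual graph convolution over a fixed road-distance graph and a learned adaptive-correlation graph, a bidirectional GRU over the historical sequence, a temporal attention over the resulting hidden states, and a spatial attention across all locations. All class names, tensor shapes, and hyper-parameters here are illustrative assumptions; this is not the authors' released implementation of DG²RNN.

```python
# Minimal sketch of the architecture described in the abstract.
# Shapes, layer names, and hyper-parameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualGraphConv(nn.Module):
    """Local spatial dependency from two views: road distance and adaptive correlation."""
    def __init__(self, num_nodes, in_dim, out_dim, dist_adj):
        super().__init__()
        self.register_buffer("dist_adj", dist_adj)                 # fixed road-distance adjacency (N, N)
        self.node_emb = nn.Parameter(torch.randn(num_nodes, 16))   # embeddings for the adaptive graph
        self.theta_d = nn.Linear(in_dim, out_dim)
        self.theta_a = nn.Linear(in_dim, out_dim)

    def forward(self, x):                                          # x: (batch, N, in_dim)
        adaptive_adj = F.softmax(F.relu(self.node_emb @ self.node_emb.T), dim=-1)
        h_dist = self.theta_d(self.dist_adj @ x)                   # propagate over the road-distance graph
        h_adap = self.theta_a(adaptive_adj @ x)                    # propagate over the adaptive-correlation graph
        return F.relu(h_dist + h_adap)


class SketchDG2RNN(nn.Module):
    def __init__(self, num_nodes, in_dim, hidden_dim, horizon, dist_adj):
        super().__init__()
        self.gconv = DualGraphConv(num_nodes, in_dim, hidden_dim, dist_adj)
        self.bigru = nn.GRU(hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.temporal_attn = nn.Linear(2 * hidden_dim, 1)          # scores each time step's hidden state
        self.spatial_attn = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.proj = nn.Linear(2 * hidden_dim, horizon)             # multi-step (flexible-horizon) output

    def forward(self, x):                                          # x: (batch, T, N, in_dim)
        B, T, N, _ = x.shape
        h = torch.stack([self.gconv(x[:, t]) for t in range(T)], dim=1)      # (B, T, N, H)
        h = h.permute(0, 2, 1, 3).reshape(B * N, T, -1)            # run the recurrent layer per node
        states, _ = self.bigru(h)                                  # (B*N, T, 2H): forward + backward contexts
        alpha = torch.softmax(self.temporal_attn(states), dim=1)   # temporal attention weights
        context = (alpha * states).sum(dim=1).reshape(B, N, -1)    # temporally weighted summary per node
        scores = context @ self.spatial_attn(context).transpose(1, 2)        # global node-to-node attention
        context = torch.softmax(scores / context.size(-1) ** 0.5, dim=-1) @ context
        return self.proj(context)                                  # (B, N, horizon) future traffic flow
```

In this sketch the adaptive adjacency is derived from learned node embeddings, mirroring the "adaptive correlation" view described in the abstract, while the road-distance adjacency is supplied as a fixed input; the single linear projection at the end stands in for whatever multi-step decoder the paper actually uses.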

