4.5 Article

Traffic transformer: Capturing the continuity and periodicity of time series for traffic forecasting

Journal

TRANSACTIONS IN GIS
Volume 24, Issue 3, Pages 736-755

Publisher

WILEY
DOI: 10.1111/tgis.12644

Keywords

-

Funding

  1. NSF award [1936677]
  2. Esri, Inc.
  3. Office of Integrative Activities
  4. Office of the Director [1936677] (funding source: National Science Foundation)

Abstract

Traffic forecasting is a challenging problem due to the complexity of jointly modeling spatio-temporal dependencies at different scales. Recently, several hybrid deep learning models have been developed to capture such dependencies. These approaches typically utilize convolutional neural networks or graph neural networks (GNNs) to model spatial dependency and leverage recurrent neural networks (RNNs) to learn temporal dependency. However, RNNs can only capture sequential information in the time series and cannot model its periodicity (e.g., weekly patterns). Moreover, RNNs are difficult to parallelize, making training and prediction less efficient. In this work we propose a novel deep learning architecture called Traffic Transformer to capture the continuity and periodicity of time series and to model spatial dependency. Our work takes inspiration from Google's Transformer framework for machine translation. We conduct extensive experiments on two real-world traffic data sets, and the results demonstrate that our model outperforms baseline models by a substantial margin.
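
The abstract describes the architecture only at a high level. The following is a minimal, hypothetical sketch (not the authors' released code) of how such a model could be assembled in PyTorch: a simple graph convolution handles spatial dependency across road sensors, learned time-of-day and day-of-week embeddings inject the daily and weekly periodicity that an RNN does not capture, and a standard Transformer encoder models the temporal sequence in parallel. All module names, dimensions, and the adjacency matrix are illustrative assumptions.

```python
# Illustrative sketch only; not the paper's implementation.
import torch
import torch.nn as nn


class SimpleGraphConv(nn.Module):
    """One-hop graph convolution: X' = A_norm @ X @ W (spatial dependency)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, nodes, in_dim); adj_norm: (nodes, nodes)
        return torch.einsum("nm,btmf->btnf", adj_norm, self.linear(x))


class TrafficTransformerSketch(nn.Module):
    def __init__(self, num_nodes: int, d_model: int = 64, nhead: int = 4,
                 num_layers: int = 2, steps_per_day: int = 288):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)               # raw speed/flow value
        self.spatial = SimpleGraphConv(d_model, d_model)      # spatial dependency
        self.tod_emb = nn.Embedding(steps_per_day, d_model)   # daily periodicity
        self.dow_emb = nn.Embedding(7, d_model)               # weekly periodicity
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.temporal = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x, tod, dow, adj_norm):
        # x: (batch, time, nodes, 1); tod, dow: (batch, time) integer indices
        b, t, n, _ = x.shape
        h = self.input_proj(x)
        h = torch.relu(self.spatial(h, adj_norm))
        # Add periodicity-aware encodings, broadcast over nodes.
        h = h + (self.tod_emb(tod) + self.dow_emb(dow)).unsqueeze(2)
        # Run the Transformer over the time axis, independently per node.
        h = h.permute(0, 2, 1, 3).reshape(b * n, t, -1)
        h = self.temporal(h)
        out = self.head(h[:, -1])                             # predict next step
        return out.reshape(b, n, 1)


if __name__ == "__main__":
    nodes = 10
    adj = torch.eye(nodes)                     # placeholder normalized adjacency
    model = TrafficTransformerSketch(num_nodes=nodes)
    x = torch.randn(2, 12, nodes, 1)           # 12 past time steps, 2 samples
    tod = torch.randint(0, 288, (2, 12))
    dow = torch.randint(0, 7, (2, 12))
    print(model(x, tod, dow, adj).shape)       # torch.Size([2, 10, 1])
```

Because self-attention processes all time steps at once, this kind of temporal module parallelizes across the sequence, which is the efficiency advantage over RNNs that the abstract points to.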

Authors


Reviews

Primary rating

4.5
(insufficient ratings)

Secondary ratings

Novelty: -
Significance: -
Scientific rigor: -

Recommendations

No data available