Authors: Ling Cai*, University of California
Topics: Transportation Geography
Keywords: Transformer, Traffic Prediction, Temporal Continuity and Periodicity, Graph Convolutional Networks
Session Type: Paper
Traffic prediction is a challenging problem due to the large size of spatial data and the complexity of modeling spatiotemporal dependencies. Recently, numerous hybrid deep learning models have been proposed to jointly capture spatial and temporal dependencies. A typical paradigm uses a CNN/GNN to model spatial dependency and an RNN to learn temporal dependency. However, an RNN considers only the relative positions of elements within each input-output sequence pair; it fails to capture the global continuity of time across different sequence pairs, and it likewise does not capture the periodicity of time. In this work, we propose a novel deep learning architecture, called Traffic Transformer, which employs a GNN to model spatial interactions and a Transformer with modified position encoding methods to capture temporal continuity and periodicity. Experiments on two real-world traffic datasets show that our model outperforms state-of-the-art models by large margins.
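To illustrate the idea of position encodings that respect global continuity and periodicity (as opposed to the vanilla Transformer's sequence-relative indices), the sketch below encodes each observation's global time index and adds sinusoids whose wavelengths match known traffic periods. The function name, the split of dimensions, and the default periods (288 five-minute steps per day, 288 × 7 per week) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def periodic_position_encoding(timestamps, d_model, periods=(288, 288 * 7)):
    """Hypothetical sketch of a timestamp-based positional encoding.

    Instead of indexing positions relative to each input sequence,
    this encodes the *global* time index of every observation
    (continuity), plus sinusoids locked to daily/weekly traffic
    cycles (periodicity). Dimension split and periods are assumptions.
    """
    timestamps = np.asarray(timestamps, dtype=float)  # global time indices
    pe = np.zeros((len(timestamps), d_model))
    half = d_model // 2
    # First half: standard sinusoidal encoding of the global time index
    for i in range(0, half, 2):
        freq = 1.0 / (10000 ** (i / half))
        pe[:, i] = np.sin(timestamps * freq)
        pe[:, i + 1] = np.cos(timestamps * freq)
    # Second half: sinusoids whose wavelengths equal the known periods,
    # so the encoding repeats exactly after one day / one week
    for k, period in enumerate(periods):
        pe[:, half + 2 * k] = np.sin(2 * np.pi * timestamps / period)
        pe[:, half + 2 * k + 1] = np.cos(2 * np.pi * timestamps / period)
    return pe
```

Because the periodic dimensions depend only on the timestamp modulo the period, two observations taken exactly one day apart receive identical daily-cycle coordinates, while the continuity dimensions still distinguish them.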