Note: All session times are in Mountain Daylight Time.

Traffic Transformer: Capturing the Continuity and Periodicity of Time

Authors: Ling Cai*, University of California
Topics: Transportation Geography
Keywords: Transformer, Traffic Prediction, Temporal Continuity and Periodicity, Graph Convolutional Networks
Session Type: Paper
Presentation File: No File Uploaded


Traffic prediction is a challenging problem due to the large size of spatial data and the complexity of modeling spatiotemporal dependencies. Recently, numerous hybrid deep learning models have been proposed to jointly capture spatial and temporal dependencies. A typical paradigm uses a CNN/GNN to model spatial dependency and an RNN to learn temporal dependency. However, RNNs only consider the relative positions of elements within an input-output sequence pair, failing to capture the global continuity of time across different sequence pairs. Moreover, RNN models do not capture the periodicity of time. In this work, we propose a novel deep learning architecture, called Traffic Transformer, which employs a GNN to model spatial interactions and a Transformer with modified position encoding methods to capture temporal continuity and periodicity. Experiments are conducted on two real-world traffic data sets. Results show that our model outperforms state-of-the-art models by large margins.
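The abstract does not spell out the modified position encoding, but the idea of encoding both global (continuous) time and phase within a recurring cycle can be sketched as below. This is a minimal illustration, not the authors' implementation: the function name, the choice of combining a standard sinusoidal encoding of absolute time-step indices with a sinusoidal encoding of the position within a daily period (e.g. 288 five-minute slots), and the dimension split are all assumptions for illustration.

```python
import numpy as np

def periodic_position_encoding(timestamps, d_model, period=288):
    """Illustrative position encoding combining temporal continuity and periodicity.

    timestamps : 1-D array of global time-step indices (e.g. 5-minute slots since epoch)
    d_model    : embedding dimension, assumed divisible by 4 here
    period     : steps per cycle, e.g. 288 five-minute slots per day

    Half the dimensions encode absolute time (continuity across sequence pairs);
    the other half encode the phase within the period (daily periodicity).
    """
    timestamps = np.asarray(timestamps, dtype=float)
    half = d_model // 2
    # Sinusoidal frequencies, as in the standard Transformer position encoding
    freqs = 1.0 / (10000.0 ** (2.0 * np.arange(half // 2) / half))

    def sinusoid(pos):
        angles = pos[:, None] * freqs[None, :]
        return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

    global_part = sinusoid(timestamps)             # absolute time: continuity
    periodic_part = sinusoid(timestamps % period)  # phase in the cycle: periodicity
    return np.concatenate([global_part, periodic_part], axis=-1)
```

With this construction, two time steps exactly one period apart share the same periodic half of their encoding while their global halves differ, so a Transformer attending over these features can exploit both daily recurrence and absolute temporal ordering.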
