Long-term, network-wide, and accurate traffic speed forecasting is one of the most challenging tasks. Most proposed models treat it as a spatiotemporal graph modeling problem and use Graph Convolutional Network (GCN) based methods. This paper proposes a novel model, Traffic Transformer, for spatial-temporal graph modeling and long-term traffic forecasting. The Transformer is the most popular framework in Natural Language Processing (NLP); by adapting it to the spatiotemporal problem, the model can dynamically extract features from the data through multi-head attention and fuse these features for traffic forecasting. Furthermore, analyzing the attention weight matrices can reveal the influential parts of road networks, allowing us to understand traffic networks better. Experimental results on public traffic network datasets and real-world traffic networks demonstrate that our proposed model achieves better performance than the state-of-the-art ones.
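As an illustration of the idea described above, the following is a minimal sketch (not the authors' implementation; the linked repository below may differ) of applying multi-head attention to a spatiotemporal traffic tensor in PyTorch. The layer sizes, the single attention block, and the METR-LA-sized input of 207 sensors are illustrative assumptions.

```python
# Minimal sketch: multi-head attention over historical traffic speeds.
# This is NOT the paper's architecture; shapes and layer choices are assumptions.
import torch
import torch.nn as nn

class TrafficAttentionBlock(nn.Module):
    def __init__(self, num_nodes: int, d_model: int = 64, num_heads: int = 4):
        super().__init__()
        # Project each time step's network-wide speed vector into a feature space.
        self.input_proj = nn.Linear(num_nodes, d_model)
        # Multi-head self-attention over the time axis (batch-first tensors).
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.output_proj = nn.Linear(d_model, num_nodes)

    def forward(self, x):
        # x: (batch, time_steps, num_nodes) of historical speeds.
        h = self.input_proj(x)
        # attn_weights: (batch, time_steps, time_steps); inspecting these weights
        # is one way to see which historical steps the model attends to.
        h, attn_weights = self.attn(h, h, h, need_weights=True)
        return self.output_proj(h), attn_weights

# Usage: 12 past steps for 207 sensors (a METR-LA-sized input, as an example).
x = torch.randn(8, 12, 207)
model = TrafficAttentionBlock(num_nodes=207)
pred, weights = model(x)
print(pred.shape, weights.shape)  # torch.Size([8, 12, 207]) torch.Size([8, 12, 12])
```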

Author(s): Haoyang Yan, Xiaolei Ma

Links: PDF - Abstract

Code:

https://github.com/Legend94rz/spatial-transformer


Keywords: traffic - transformer - network - features - networks
