With the rapid growth of worldwide urbanization, the demand for public transportation continues to increase. To improve service quality, predicting passenger flow is important for transport operators. Information on passenger density can be used to give early warnings of overcrowding and to determine whether additional fleet capacity is required. However, passenger flow forecasting is a challenging task, as it is affected by many complex factors such as spatial dependencies, temporal dependencies, and external influences. Furthermore, the ability to learn long-term dependencies in the data is also crucial, as flow information from the distant past contributes to the flow over time. Most existing studies struggle with this issue, especially learning long-term dependencies, as they rely heavily on raw handcrafted features and require high memory bandwidth to compute. To address these issues, we propose a Selective Feedback Transformer (SFT) capable of learning long-term dependencies efficiently, where the selective feedback mechanism computes feedback only from the dominant query-key pairs in memory. Experimental results demonstrate that the proposed model outperforms all the benchmarked methods by 27% - 37% in terms of RMSE and 36% - 50% in terms of MAE. Additionally, when the proposed model is tested with a shallower configuration (fewer decoding layers), it exhibits a substantial improvement of 14% - 57% in training time and 15% - 46% in inference time, with minimal impact on accuracy.
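To illustrate the idea of attending only to the dominant query-key pairs in memory, the following is a minimal sketch in PyTorch. It assumes the "dominant" pairs are selected by a simple top-k over attention scores; the function name `selective_feedback`, the `top_k` parameter, and the memory layout are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a "selective feedback" attention step: attend to a
# memory of past flow representations, keeping only the top-k dominant
# query-key pairs per query. Names and the top-k selection are assumptions.
import torch
import torch.nn.functional as F


def selective_feedback(query, memory_keys, memory_values, top_k=8):
    """Attend over memory, but only through the k strongest query-key matches.

    query:         (batch, n_queries, d)
    memory_keys:   (batch, n_memory, d)
    memory_values: (batch, n_memory, d)
    """
    d = query.size(-1)
    # Raw attention scores between every query and every memory slot.
    scores = torch.matmul(query, memory_keys.transpose(-2, -1)) / d ** 0.5

    # Keep only the top-k dominant query-key pairs per query; mask out the rest.
    k = min(top_k, scores.size(-1))
    topk_scores, topk_idx = scores.topk(k, dim=-1)
    masked = torch.full_like(scores, float("-inf"))
    masked.scatter_(-1, topk_idx, topk_scores)

    # Softmax over the selected pairs only, then aggregate their feedback.
    weights = F.softmax(masked, dim=-1)
    return torch.matmul(weights, memory_values)


if __name__ == "__main__":
    q = torch.randn(2, 4, 32)     # e.g. 4 query time steps
    mem = torch.randn(2, 64, 32)  # 64 memory slots of past flow features
    out = selective_feedback(q, mem, mem, top_k=8)
    print(out.shape)  # torch.Size([2, 4, 32])
```

Because the non-selected pairs are masked before the softmax, they contribute nothing to the output, which is how such a scheme can reduce the computation spent on long-range memory while still letting distant past information influence the forecast.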
               