BARTNet: Context-Aware Deep Learning Framework for BART Ridership Forecasting

Publication Date

1-1-2025

Document Type

Conference Proceeding

Publication Title

Proceedings - 2025 IEEE Conference on Artificial Intelligence, CAI 2025

DOI

10.1109/CAI64502.2025.00147

First Page

825

Last Page

828

Abstract

In this paper, we propose a novel context-aware deep learning framework, BARTNet, for forecasting Bay Area Rapid Transit (BART) ridership by integrating contextual and temporal data. Traditional statistical methods, such as the Autoregressive Integrated Moving Average (ARIMA) and Simple Moving Average (SMA), fail to capture the complex non-linear dependencies and external factors influencing ridership. To overcome these limitations and improve forecasting accuracy, BARTNet applies advanced deep learning architectures, including Long Short-Term Memory (LSTM), Bidirectional Long Short-Term Memory (BiLSTM), Temporal Convolutional Network (TCN), the Transformer model, and Bidirectional Encoder Representations from Transformers (BERT), to an integrated dataset combining contextual and temporal features. Additionally, we performed robust feature selection to identify the most relevant contextual features prior to integration. We evaluated the performance of our proposed models using Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Root Mean Squared Error (RMSE). The experimental results demonstrate that BARTNet significantly outperformed ARIMA, showcasing its potential for accurate and reliable ridership forecasting.
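
For reference, the three evaluation metrics named in the abstract are standard and can be computed as in the minimal NumPy sketch below. This sketch is not drawn from the paper's code, and the ridership values shown are purely illustrative.

import numpy as np

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of prediction errors.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, in percent (assumes no zero actual values).
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.mean(np.abs((y_true - y_pred) / y_true)) * 100

def rmse(y_true, y_pred):
    # Root Mean Squared Error: penalizes large errors more heavily than MAE.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Hypothetical daily ridership counts, for illustration only.
actual    = [12050, 11800, 13420, 9900]
predicted = [11900, 12010, 13100, 10250]
print(mae(actual, predicted), mape(actual, predicted), rmse(actual, predicted))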

Keywords

ARIMA, BERT, LSTM, Ridership Forecasting, Transformers

Department

Applied Data Science
