Publication Date
Spring 2025
Degree Type
Master's Project
Degree Name
Master of Science in Computer Science (MSCS)
Department
Computer Science
First Advisor
Saptarshi Sengupta
Second Advisor
William Andreopoulos
Third Advisor
Thomas Austin
Keywords
Time series forecasting, transformer models, dynamic attention, adversarial robustness, FGSM attack, BIM attack
Abstract
Transformer architectures have emerged as powerful tools for time series forecasting, excelling at capturing complex temporal dependencies across multivariate inputs. However, these models are highly susceptible to adversarial attacks such as the Fast Gradient Sign Method (FGSM) and the Basic Iterative Method (BIM), which can significantly degrade predictive performance through small, targeted perturbations. This work integrates dynamic attention mechanisms (adaptive masking modules that introduce controlled variability into attention pathways) into a transformer forecasting model to enhance robustness against such attacks. Using two distinct datasets, we compare the performance of a standard transformer and a dynamic attention-enhanced transformer under both clean and adversarial conditions. Results show that while the two models perform similarly on clean data, the dynamic attention model consistently maintains lower error rates as attack intensity increases, demonstrating improved resilience without adversarial training or additional defense layers. These findings highlight dynamic architectural defenses as lightweight, model-level strategies for improving the robustness and reliability of deep learning systems in time series forecasting applications.
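For concreteness, the sketch below illustrates the two ingredients the abstract describes: gradient-based FGSM/BIM input perturbations and a stochastic attention-masking layer. It is a minimal, hypothetical PyTorch rendering of the general techniques, not the project's actual implementation; all names and hyperparameters (DynamicAttention, mask_prob, epsilon, alpha) are assumptions for illustration.

```python
import torch
import torch.nn as nn

def fgsm_perturb(model, x, y, loss_fn, epsilon):
    """One-step FGSM: nudge the input along the sign of the loss gradient."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = loss_fn(model(x_adv), y)
    loss.backward()
    return (x_adv + epsilon * x_adv.grad.sign()).detach()

def bim_perturb(model, x, y, loss_fn, epsilon, alpha, steps):
    """BIM: iterated FGSM with step size alpha, clipped to an epsilon ball around x."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv = fgsm_perturb(model, x_adv, y, loss_fn, alpha)
        x_adv = x + torch.clamp(x_adv - x, -epsilon, epsilon)
    return x_adv

class DynamicAttention(nn.Module):
    """Hypothetical adaptive masking module: randomly hide a small fraction of
    attention pathways each forward pass, one plausible reading of the
    'controlled variability' the abstract describes."""
    def __init__(self, d_model, n_heads, mask_prob=0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mask_prob = mask_prob

    def forward(self, x):
        seq_len = x.size(1)
        # Boolean mask over query/key positions; True entries are blocked.
        mask = torch.rand(seq_len, seq_len, device=x.device) < self.mask_prob
        mask.fill_diagonal_(False)  # a position can always attend to itself
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return out

# Example evaluation under attack, e.g.:
#   x_adv = fgsm_perturb(model, x, y, nn.MSELoss(), epsilon=0.05)
#   mse_under_attack = nn.MSELoss()(model(x_adv), y)
```

Under this reading, the defense's appeal is that the masked attention pattern varies between forward passes, so a perturbation computed against one realization of the attention pathways transfers imperfectly to the next, without any adversarial training.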
Recommended Citation
Patel, Kush, "Transformers in Time-Series Forecasting: Enhancing Robustness via Dynamic Attention Mechanisms" (2025). Master's Projects. 1520.
DOI: https://doi.org/10.31979/etd.tebp-5gr2
https://scholarworks.sjsu.edu/etd_projects/1520