It's About Time: Incorporating Temporality in Retrieval Augmented Language Models
Publication Date
1-1-2025
Document Type
Conference Proceeding
Publication Title
Proceedings of the 2025 IEEE Conference on Artificial Intelligence (CAI 2025)
DOI
10.1109/CAI64502.2025.00019
First Page
75
Last Page
82
Abstract
In this paper, we propose and evaluate TempRALM, a temporally aware retrieval-augmented language model with few-shot learning capabilities, which considers both the semantic and temporal relevance of retrieved documents in relation to a given query, rather than relying on semantic similarity alone. Our approach demonstrates up to 74% improvement in performance over the baseline state-of-the-art retrieval-augmented language model ATLAS, and 32% improvement over a state-of-the-art commercial large language model augmented with retrieval. TempRALM achieves these improvements without requiring model pre-training, document index replacement, or other computationally intensive operations. Additionally, we introduce and evaluate TablePedia, a novel automated method for generating ground truth data for retrieval-augmented language models and temporal question-answering.
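The abstract describes ranking retrieved documents by both semantic and temporal relevance rather than semantic similarity alone. A minimal sketch of that idea follows; the exponential-decay temporal score, the `alpha` mixing weight, and all function and field names here are illustrative assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    text: str
    year: int            # document's publication year
    semantic_sim: float  # precomputed query-document similarity in [0, 1]

def temporal_score(doc_year: int, query_year: int, decay: float = 0.5) -> float:
    # Hypothetical recency score: documents dated closer to the
    # query's reference year score higher (exponential decay).
    return decay ** abs(query_year - doc_year)

def rank_with_temporality(docs: list[Doc], query_year: int,
                          alpha: float = 0.5) -> list[Doc]:
    # Rank by a convex combination of semantic and temporal relevance;
    # alpha = 1.0 recovers plain semantic-only retrieval.
    def combined(d: Doc) -> float:
        return alpha * d.semantic_sim + (1 - alpha) * temporal_score(d.year, query_year)
    return sorted(docs, key=combined, reverse=True)

docs = [
    Doc("2015 team roster", 2015, 0.9),  # strong semantic match, but stale
    Doc("2023 team roster", 2023, 0.8),  # slightly weaker match, but current
]
ranked = rank_with_temporality(docs, query_year=2023)
print(ranked[0].text)  # the temporally relevant document wins
```

With semantic similarity alone the 2015 document would rank first; blending in the temporal term promotes the 2023 document for a 2023-anchored query, which is the behavior the abstract attributes to TempRALM.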
Keywords
Information Retrieval, Retrieval, Temporality
Department
Computer Engineering
Recommended Citation
Anoushka Gade, Jorjeta G. Jetcheva, and Hardi Trivedi. "It's About Time: Incorporating Temporality in Retrieval Augmented Language Models." Proceedings of the 2025 IEEE Conference on Artificial Intelligence (CAI 2025) (2025): 75-82. https://doi.org/10.1109/CAI64502.2025.00019