Publication Date
2-28-2025
Document Type
Article
Publication Title
Transactions of the Association for Computational Linguistics
Volume
13
DOI
10.1162/tacl_a_00736
First Page
200
Last Page
219
Abstract
We study the sequence-to-sequence mapping capacity of transformers by relating them to finite transducers, and find that they can express surprisingly large classes of (total functional) transductions. We do so using variants of RASP, a programming language designed to help people “think like transformers,” as an intermediate representation. We extend the existing Boolean variant B-RASP to sequence-to-sequence transductions and show that it computes exactly the first-order rational transductions (such as string rotation). Then, we introduce two new extensions. B-RASP[pos] enables calculations on positions (such as copying the first half of a string) and contains all first-order regular transductions. S-RASP adds prefix sum, which enables additional arithmetic operations (such as squaring a string) and contains all first-order polyregular transductions. Finally, we show that masked average-hard attention transformers can simulate S-RASP.
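As an informal illustration of the prefix-sum primitive the abstract attributes to S-RASP, here is a minimal Python sketch. The function names, the use of plain lists, and the reading of "squaring a string" as mapping w to |w| concatenated copies of itself are illustrative assumptions, not the paper's formalism:

def prefix_sum(values):
    # Cumulative sums of an integer sequence, the primitive S-RASP adds.
    # prefix_sum([1, 0, 1, 1]) == [1, 1, 2, 3]
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

def square(w):
    # Illustrative "squaring" transduction: map w to |w| copies of w,
    # e.g. square("ab") == "abab". With position arithmetic available,
    # each output position i can be sent back to input position i mod |w|;
    # Python's indexing stands in for that calculation here.
    n = len(w)
    return "".join(w[i % n] for i in range(n * n))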
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.
Department
Linguistics and Language Development
Recommended Citation
Lena Strobl, Dana Angluin, David Chiang, Jonathan Rawski, and Ashish Sabharwal. "Transformers as Transducers." Transactions of the Association for Computational Linguistics 13 (2025): 200–219. https://doi.org/10.1162/tacl_a_00736