Quoting François Chollet

From Simon Willison's Weblog
A common misconception about Transformers is to believe that they're a sequence-processing architecture. They're not. They're a set-processing architecture. Transformers are 100% order-agnostic (which was the big innovation compared to RNNs, back in late 2016 -- you compute the full matrix of pairwise token interactions instead of processing one...
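
The order-agnosticism Chollet describes is easy to verify directly. Below is a minimal sketch (not Chollet's code) of single-head scaled dot-product self-attention in NumPy, with no positional embeddings: permuting the input tokens merely permutes the output rows, because attention computes the full matrix of pairwise token interactions rather than scanning the sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a set of tokens.

    X: (n_tokens, d_model). Every token attends to every token via the
    full n x n matrix of pairwise interactions -- no sequential scan.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # (n, n) pairwise matrix
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

d = 8
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
X = rng.normal(size=(5, d))       # 5 tokens, treated as an unordered set
perm = rng.permutation(5)

out = self_attention(X, Wq, Wk, Wv)
out_permuted = self_attention(X[perm], Wq, Wk, Wv)

# Same outputs, just reordered: the layer has no notion of token order.
assert np.allclose(out[perm], out_permuted)
```

This permutation equivariance is exactly why Transformers inject order information separately, via positional embeddings, rather than getting it from the architecture itself.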