A ChatGPT clone, in 3000 bytes of C, backed by GPT-2

From Nicholas Carlini's blog
This program is a dependency-free implementation of GPT-2, including byte-pair encoding and transformer inference, in ~3000 bytes of C. I then use it to build something like ChatGPT.
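To give a flavor of the kind of building block such a program needs, here is a minimal, self-contained sketch (not taken from the original code) of the last step of transformer inference: turning the model's output logits into the next token with a numerically stable softmax followed by greedy argmax decoding. The vocabulary size and logit values below are made up for the example; GPT-2's real vocabulary has 50257 tokens.

```c
/* Hypothetical sketch: picking the next token from transformer logits.
 * Vocabulary size and logit values are illustrative only. */
#include <stdio.h>
#include <math.h>

#define VOCAB 5  /* GPT-2 really uses 50257 tokens; 5 keeps the demo small */

/* Numerically stable softmax over n logits, written into probs. */
void softmax(const float *logits, float *probs, int n) {
    float max = logits[0], sum = 0;
    for (int i = 1; i < n; i++) if (logits[i] > max) max = logits[i];
    for (int i = 0; i < n; i++) { probs[i] = expf(logits[i] - max); sum += probs[i]; }
    for (int i = 0; i < n; i++) probs[i] /= sum;
}

/* Greedy decoding: return the index of the highest-probability token. */
int argmax(const float *probs, int n) {
    int best = 0;
    for (int i = 1; i < n; i++) if (probs[i] > probs[best]) best = i;
    return best;
}

int main(void) {
    float logits[VOCAB] = {1.2f, 0.3f, 2.7f, -0.5f, 0.9f};  /* made-up logits */
    float probs[VOCAB];
    softmax(logits, probs, VOCAB);
    int next = argmax(probs, VOCAB);
    printf("next token id: %d (p=%.3f)\n", next, probs[next]);
    return 0;
}
```

Compile with `cc softmax_demo.c -lm`. In a full chat loop this step would run once per generated token, with the chosen token appended to the context before the next forward pass; a real implementation would typically sample from the distribution (with temperature) rather than always taking the argmax.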