Zig Tokenizer

From Mitchell Hashimoto's blog.
Tokenization is the first step in a typical compiler pipeline: it converts a stream of bytes (the raw programming language source) into a stream of tokens.
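To make that concrete, here is a minimal sketch (not from the original post) that runs a one-line source string through Zig's standard-library tokenizer, `std.zig.Tokenizer`, and prints each token's tag along with the bytes it covers. The exact tag names have shifted across Zig releases, so treat the details as illustrative:

```zig
const std = @import("std");

pub fn main() !void {
    // std.zig.Tokenizer expects a null-terminated source buffer.
    const source: [:0]const u8 = "const x = 42;";

    var tokenizer = std.zig.Tokenizer.init(source);
    while (true) {
        const token = tokenizer.next();
        // The tokenizer signals end-of-input with an .eof token.
        if (token.tag == .eof) break;
        // Each token carries a tag and the byte range it spans in the source.
        std.debug.print("{s} \"{s}\"\n", .{
            @tagName(token.tag),
            source[token.loc.start..token.loc.end],
        });
    }
}
```

Running this prints one line per token (e.g. a `keyword_const`, an `identifier`, an `equal`, a number literal, and a `semicolon`), which is exactly the byte-stream-to-token-stream transformation described above.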