Llama 3.2: New Edge AI and Vision Models

From the blog Tao of Mac
The timing for this is great, as I’m starting to get back to shoving LLMs into single-board computers. The 128K token context length seems to be becoming a standard of sorts (which is also nice), and I might actually try the vision models as well (with the usual caveats about SBC NPUs being quite limited in both TOPS and data formats). And, of...
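For a sense of scale, here is a minimal sketch of the plain CPU route on one of these boards using llama-cpp-python and a quantized GGUF build of the 3B model; the filename, context window, and thread count are placeholders, and the context is deliberately cut far below the 128K maximum so it fits in a few gigabytes of SBC RAM:

```python
# Rough sketch: a quantized Llama 3.2 3B on an SBC via llama-cpp-python.
# The GGUF filename, context size, and thread count are assumptions --
# tune them to whatever the board's RAM and core count can handle.
from llama_cpp import Llama

llm = Llama(
    model_path="Llama-3.2-3B-Instruct-Q4_K_M.gguf",  # placeholder quantized model file
    n_ctx=8192,     # well below the 128K maximum; the full context won't fit in SBC RAM
    n_threads=4,    # typical SBC core count
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarise what an NPU is in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```

Nothing here touches the NPU: this is the straightforward CPU fallback that usually works first, before the TOPS and data-format caveats above come into play with the vendor toolchains.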