Everything I've learned so far about running local LLMs

From Simon Willison's Weblog
Chris Wellons shares detailed notes on his experience running local LLMs on Windows, though most of these tips apply to other operating systems as well. This is great: there's a ton of detail here and the root recommendations are very solid. Use llama-server from llama.cpp and try ~8B...
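
To make that recommendation concrete, here's a minimal sketch of querying a locally running llama-server instance from Python (assuming the server was started with something like `llama-server -m ./model.gguf --port 8080`; the model path here is hypothetical). llama-server exposes an OpenAI-compatible chat completions endpoint, so the standard library is enough to talk to it:

```python
import json
import urllib.request

# Assumes llama-server is already running on localhost:8080,
# e.g. started with: llama-server -m ./model.gguf --port 8080
payload = {
    # llama-server serves whichever model it was launched with,
    # so the "model" value here is largely a placeholder
    "model": "local",
    "messages": [
        {"role": "user", "content": "Explain what a GGUF file is in one sentence."}
    ],
}

req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# Response follows the OpenAI chat completions shape
print(body["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, the same endpoint also works with existing OpenAI client libraries pointed at the local base URL.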