chatgpt-shell goes offline

Since chatgpt-shell went multi-model, it was only a matter of time until we added support for local/offline models. As of version 2.0.6, chatgpt-shell has a basic Ollama implementation (llama3.2 for now). chatgpt-shell is more than a shell. Check out the demos in the previous post. For anyone keen on keeping all their LLM interactions offline,...
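
If you want to give the Ollama route a spin, a minimal sketch looks something like the one below. It assumes Ollama is already installed and serving locally with llama3.2 pulled, and that `chatgpt-shell-swap-model` and `chatgpt-shell-model-version` behave as in recent releases; double-check the names against the version you have installed.

```emacs-lisp
;; Assumes Ollama is set up outside Emacs, e.g. from a terminal:
;;   ollama pull llama3.2
;;   ollama serve   ; often started automatically by the installer

;; Install chatgpt-shell 2.0.6 or later (available on MELPA).
(use-package chatgpt-shell
  :ensure t)

;; Start the shell with M-x chatgpt-shell, then pick the Ollama-backed
;; model interactively via M-x chatgpt-shell-swap-model, or set it up
;; front (the value here is an assumption; check your installed version).
(setq chatgpt-shell-model-version "llama3.2")
```

With that in place, everything typed into the shell stays on your machine, which is the whole point of the offline setup.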