AI More Stories

From matt.sh
These were all generated locally using the 34 billion parameter Yi model quantized to 4 bits (about 20 GB quantized versus roughly 70 GB at its native weight width). At some tasks it feels nearly as good as, or even better than, GPT-4. And since running the model locally costs nothing per request, the cost-benefit comes out far ahead of racking up $3-$10 per day in hosted GPT-4 charges when doing personal experiments and research into...
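The post doesn't show the exact setup, but a minimal sketch of this kind of local 4-bit inference, assuming a GGUF quantization of Yi-34B loaded through llama-cpp-python (the file name, prompt, and sampling parameters below are illustrative, not the post's actual configuration), might look like:

```python
# Sketch of local 4-bit inference with llama-cpp-python (assumed toolchain,
# not necessarily what the author used).
from llama_cpp import Llama

llm = Llama(
    model_path="models/yi-34b-chat.Q4_K_M.gguf",  # hypothetical ~20 GB 4-bit quantized file
    n_ctx=4096,        # context window for generation
    n_gpu_layers=-1,   # offload all layers to GPU if VRAM allows; set 0 for CPU-only
)

out = llm(
    "Write a short story about a lighthouse keeper who finds a hidden door.",
    max_tokens=512,
    temperature=0.8,
)
print(out["choices"][0]["text"])
```

Once the quantized weights are on disk, each story costs only local compute time, which is the cost asymmetry the paragraph above is pointing at.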