1) Offline connectivity — it's pretty cool to be able to debug technical problems while flying (or otherwise off grid) with a local LLM, and current 8B models are usually good enough for the first round of questions you would otherwise have googled.
2) Privacy — everything runs on your own machine, so your prompts and data never leave it.
3) Removing safety filters — there are some great “abliterated” models out there that have had their refusal behavior removed. Running these locally and never having your request refused due to corporate risk aversion is a very different experience to calling a safety-neutered API.
Depending on your use case, some, all, or none of these will be relevant, but they are undeniable benefits that are very much within reach using a laptop and the current crop of models.