Uillem – an offline, containerized LLM interface (github.com/paolo-g)
9 points by uillem on Aug 11, 2023 | hide | past | favorite | 7 comments


I noticed that it's now possible to run LLMs offline on a personal computer, but every solution I found required installing dependencies locally, so I built a containerized alternative that makes it easy to swap out the model in use: https://github.com/paolo-g/uillem
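For anyone curious what "containerized with swappable models" usually looks like in practice, the common pattern is to bind-mount a host directory of model files into the container and point the app at one of them. This is a hypothetical sketch of that general pattern, not uillem's actual interface; the image name, mount path, and environment variable are placeholders, so check the repo's README for the real commands:

```shell
# Hypothetical example of the general pattern; image name, paths, and
# MODEL_PATH are placeholders, not uillem's actual API.
docker run --rm -it \
  -v "$HOME/models:/models:ro" \
  -e MODEL_PATH=/models/llama-7b.gguf \
  -p 8080:8080 \
  some-llm-image:latest
```

Swapping models then means changing one flag and restarting the container, with no host-side dependencies beyond Docker itself.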


Nice, love seeing Paolo post this! He is a great guy, we used to work together, and I’m excited to see where he takes this.


Paolo indeed is a great guy. Can’t wait to hang out with him next.


This is pretty neat! Now I just need a good library of models to plug in haha


Check out Hugging Face. It's cool to be able to try out different models to see how they perform. RAM requirements can get steep for the larger models.
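A rough back-of-envelope for those RAM requirements (my own rule of thumb, not from the thread): weight memory is roughly parameter count times bytes per weight, plus some overhead for activations and context. The `overhead` factor here is an assumed fudge factor, not a measured value:

```python
def approx_ram_gb(n_params_billions: float, bytes_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Rough estimate of GiB needed to load a model's weights.

    overhead is an assumed ~20% fudge factor for activations/context.
    """
    return n_params_billions * 1e9 * bytes_per_weight * overhead / 2**30

# e.g. a 7B-parameter model in 16-bit floats vs. 4-bit quantization
print(f"7B fp16:  ~{approx_ram_gb(7, 2):.1f} GiB")
print(f"7B 4-bit: ~{approx_ram_gb(7, 0.5):.1f} GiB")
```

By this estimate a 7B model wants on the order of 16 GiB at fp16 but only around 4 GiB with 4-bit quantization, which is why quantized builds are what make laptop-scale inference practical.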


Very impressive, can't wait to give it a try!


Cool



