Hacker News

I noticed that it's now possible to run LLMs offline on personal computers, but every solution I found required installing dependencies, so I built a containerized setup that makes it easy to swap out the model in use: https://github.com/paolo-g/uillem


