Hacker News

I was just thinking today about what kind of abliterated models the US security apparatus is cooking up and what they're using them for. These kinds of things were a lot more fun when they were just silly Dan Brown novels and not real horrors on earth.


AFAIK, nation-state actors are likely using models that don't need to be abliterated. Why introduce a step that cripples their performance? Do you really need refusal suppression when hunting for zero-days? I might need to watch Psycho-Pass again.





