> It would be quite easy to give an LLM instructions to discourage suicide

This assumes the person talking to the LLM is in a coherent state of mind and asks the right question. LLMs just give you what you want. They don't tell you whether what you want is right or wrong.


