Hacker News

Those talk about a mechanism for detecting prompt injection. If that were true, we would have seen the chatbot refuse, not lie.

