
> I can't think of one example where someone was harmed by an LLM.

Not yet. That's why there is this Act.

Using AI in law enforcement or to perform surgery could cause real harm.

LLMs like ChatGPT fall into the "low risk" category, for which there should be no mandatory obligations.


