
> One could also argue the real problem is the tech industry constantly ignoring regulations that were put in place for good reasons

Those reasons were good at the time. But by definition, disruptive technologies change the situation, sometimes for the better, sometimes not. You have to leave room for innovation or you stagnate.

ChatGPT is not disruptive enough to be used in law, end of story. It's a very impressive language model, but like any language model it will hallucinate, inventing arguments that sound impressive on a surface level but bear no legal authority whatsoever. That's simply not acceptable in a courtroom.


Everyone’s permitted to represent themselves pro se, and a pro se litigant could obviously use ChatGPT. What one can’t do is offer ChatGPT’s output to others as legal advice, and that still seems like a solid reason for regulation, given how terrible and inaccurate some ChatGPT output has been.

