
> Yes, it’s particularly bad when the information found on the web is flawed.

It's funny you say that, because I was going to echo your parent's sentiment and point out it's exactly the same with any news article you read.

The majority of content these LLMs are consuming is not from domain experts.



Right, but LLMs are also consuming AWS product documentation and Terraform language docs, both of which I have read a lot of, and they're often badly wrong on things from those domains in ways that are really easy for me to spot.

This isn’t just “shit in, shit out”. Hallucination is real and still problematic.



