Right, but LLMs are also consuming AWS product documentation and Terraform language docs, two things I have read a lot of, and they're often badly wrong on material from both of those domains in ways that are really easy for me to spot.
This isn’t just “shit in, shit out”. Hallucination is real and still problematic.
It's funny you say that, because I was going to echo your parent comment's sentiment and point out it's exactly the same with any news article you read.
The majority of content these LLMs are consuming is not from domain experts.