
Is the commenter above you implying humans hallucinate to the level of LLMs? Maybe a hungover freshman working on a tight deadline without having read the book does, but not professionals.

Even a mediocre employee will often realize they're stuck, seek assistance, and then learn something from the assistance instead of making stuff up.



Depending on the country / culture / job description, "making stuff up" is sometimes a viable option for "adjust accordingly", on all levels of expertise.


> Even a mediocre employee will often realize they're stuck, seek assistance, and then learn something from the assistance instead of making stuff up.

Only if they're aware of their mediocrity. It's the ones who aren't, the ones who bumble on regardless, who are dangerous - just like AI.


People commonly realize when they are stuck... But note, the LLM isn't stuck: it keeps producing (total bullshit) material. The same problem happens with humans all the time when they go off on the wrong tangent and some supervisory function (such as the manager of a business) has to step in and ask wtf they are up to.



