Hacker News

An LLM can't "realize" anything. Unless you are saying that LLMs are aware.


It's a term I used to explain that in 'thinking' mode, LLMs read their own output and call out things like incorrect math statements before the response is posted to the user (rough sketch of the loop below).

Now you probably want a debate about the term 'thinking' mode, but I can't be bothered with that. It's pretty clear what was meant, and semantic arguments suck. Don't do that.
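
For what it's worth, the loop being described is mechanically simple, whatever you call it: draft, self-check, revise. Here's a minimal Python sketch, assuming a hypothetical single-turn `call_model` function standing in for any LLM API (none of these names come from a real provider):

    def call_model(prompt: str) -> str:
        """Hypothetical stand-in for a single LLM API call."""
        raise NotImplementedError("wire up a real model here")

    def answer_with_review(question: str) -> str:
        # First pass: produce a draft answer.
        draft = call_model(f"Answer this question:\n{question}")
        # Second pass: the model reads its own draft and flags errors
        # (e.g. a wrong arithmetic step) before the user sees anything.
        critique = call_model(
            "List any factual or arithmetic errors in this draft, "
            f"or say 'no errors':\n{draft}"
        )
        if "no errors" in critique.lower():
            return draft
        # Final pass: revise the draft using the self-critique; only
        # this finished text is shown to the user.
        return call_model(
            "Rewrite this draft, fixing the listed issues.\n"
            f"Draft:\n{draft}\nIssues:\n{critique}"
        )

Whether you call that second pass "realizing" or just conditioning on its own tokens is exactly the semantic argument above.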


I want people to use correct terms; I don't think that is unreasonable.


I'm all for avoiding anthropomorphism of these things, but what word (or set of words) would you use instead?



