
> confabulation isn't a normal thing which humans do

> A normal person is aware of the limits of their knowledge, for whatever reason, LLMs are not.

Eh, both of these things are far more complicated. People perform minor confabulations all the time. There is also a clinical sense of confabulation, referring to a more serious condition that involves high rates of it coupled with dementia; that is the less common form. We know from things like eyewitness testimony that people turn into confabulatory bullshit-spewing devices very quickly, though likely via different mechanisms, like recency bias and overwriting memories by thinking about them.

On top of that, people are quite apt to lie about things they do know or can do, for a multitude of reasons. Teaching an LLM to say "I don't know" when it genuinely doesn't know something, versus just lying to you and claiming it doesn't know, will be problematic. See ChatGPT getting lazy in some of its releases for backfire effects like this.



Classic confabulation is observed with some kinds of alcohol-related brain damage, where people drink and get malnourished over a period of years. People with these syndromes create quite coherent, complex stories which they are subsequently unable to recall. This is quite different from filling in the blanks of remembered conversations, where later on there is an opportunity for error correction. With confabulation there is not, as it's tightly bound to memory impairment.

So I'm in the camp that LLMs are confabulating, and I personally think the argument that they can be seen as confabulation machines has some validity.



