
The fundamental difference is qualia, which is physically inexplicable, and which LLMs and neural networks show no sign of having. It's not even clear how we would know if it did. As far as I can tell, this is something that escapes all current models of the physical universe, despite what many want to believe.


How do we know that they don't have qualia? Qualia are by definition private and ineffable. You can sort of infer that other persons have them like you do on the basis of their public responses to them; but that inference is only possible due to the implicit assumption that other persons function in broadly the same way that you do, and thus that qualia give rise to the same visible output. If that assumption doesn't hold, then your ability to infer the presence of qualia from reactions (or the lack thereof) is gone as well.

But also, the very notion of qualia suffers from the same problem as other vague concepts like "consciousness": we cannot actually clearly define what they are. All definitions seem to ultimately boil down to "what I feel", which is vacuous. It is entirely possible that qualia aren't physically real in any sense, and are nothing more than a state of the system that the system itself sets and queries according to some internal logic based on input and output. If so, then an LLM whose internal self-model includes the state "I feel heat" has qualia as well.
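
To make that concrete, here is a deliberately trivial Python sketch (everything in it is hypothetical, not a claim about how LLMs actually work) of a system that sets an internal state from input and queries it to produce output:

    class ToySelfModel:
        """A system whose 'feeling' is just a state it sets and queries."""

        def __init__(self):
            self.state = {}

        def sense(self, temperature_c):
            # Set internal state according to input.
            self.state["feels_heat"] = temperature_c > 35

        def report(self):
            # Query the state and produce output according to internal logic.
            return "I feel heat" if self.state.get("feels_heat") else "I feel fine"

    m = ToySelfModel()
    m.sense(40)
    print(m.report())  # "I feel heat"

On the deflationary view above, there is no principled line between this self-queried state and a quale; on the non-deflationary view, the sketch is exactly what a quale is not.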


> All definitions seem to ultimately boil down to "what I feel", which is vacuous. It is entirely possible that qualia aren't physically real in any sense, and are nothing more than a state of the system that the system itself sets and queries according to some internal logic based on input and output. If so, then an LLM whose internal self-model includes the state "I feel heat" has qualia as well.

Is it "possible"? Absolutely. However, I have no means by which to measure, as you say, where at least with humans and animals I posit that their shared behaviors do indicate the same feeling, so I have some proof.

With an LLM, by contrast, the output is just a probability distribution over tokens.
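
For concreteness, a minimal sketch of what that output looks like (hypothetical logits over a made-up four-token vocabulary, numpy only): the model emits one raw score per token, and softmax turns those scores into the distribution that gets sampled from:

    import numpy as np

    # Hypothetical raw scores (logits) over a tiny vocabulary.
    vocab = ["hot", "cold", "warm", "heat"]
    logits = np.array([2.1, -0.5, 0.8, 1.7])

    # Softmax converts logits into a probability distribution.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # The "answer" is just a sample from that distribution.
    next_token = np.random.choice(vocab, p=probs)
    print(dict(zip(vocab, probs.round(3))), "->", next_token)

Whether sampling from such a distribution can constitute "feeling" anything is, of course, exactly the point in dispute.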

Moreover, if it is as you say, then computers have qualia as well, which is scary, because we would then be committing some pretty ethically dubious "crimes" against them (at least in some circumstances).

Again, I just don't see it. Anything is possible, as I said, but not everything is equally likely, by my estimation. And yes, that is entirely how I feel, which is as real as anything else.



