Hacker News

Lying requires intent. ChatGPT does not think; therefore it doesn't lie in that sense.


That's like saying robots don't murder; they just kill.


Which is actually a very good analogy. A lot of things can kill you, but only a human can be a murderer.


In movies and written fiction, "intelligent" robots, anthropomorphized animals, elves, dwarves, etc. can all commit murder when given the attributes of humans.

We don't have real things with all human attributes, but we're getting closer, and as we do, "needs to be a human" will wear thinner as an explanation of what does or doesn't count as murder, deception, and so forth.


And pit bulls, but I digress. The debate gets lost in translation once it turns into an argument over what words mean.


This is an interesting discussion. The ideas of philosophy meet the practical meaning of words here.

You can reasonably say a database doesn't lie. It's just a tool, everyone agrees it's a tool, and if you get the wrong answer, most people would agree it's your fault for writing the wrong query or using the wrong data.

But the difference between ChatGPT and a database is that ChatGPT will defend its assertions. It will say things that support its position: not just fake references but an entire line of argument.

Of course, all of this simply duplicates/simulates what humans do in discussions. You can call it a "simulated lie" if you don't like the idea of it really lying. But I claim that in normal usage people will take this as "real" lying, and ultimately that functional meaning is what "higher," more philosophical definitions will have to accept.


Somewhat unrelated, but I think philosophy will be instrumental in the development of actual AI. To make artificial intelligence, you need to know what intelligence is, and that is a philosophical question.


Merriam-Webster gives two definitions for the verb "lie". The first requires intent, the second does not:

> to create a false or misleading impression

> Statistics sometimes lie.

> The mirror never lies.



