
It's a problem of false dichotomy. Sentience isn't a light switch. It's grains of rice that accumulate into a meaningful mass. This is also the reason the abortion debate will never be resolved. We can expect similar passion, born of the same uncertainty, in the future machine sentience debate.


> It's grains of rice that accumulate into a meaningful mass

Suppose we could temporarily reduce these grains into something very small. For example, putting a human into deep anesthesia. Is it immoral to destroy their body at that point in time since they are less sentient than an insect?

Or do we consider their potential for sentience, as they can hypothetically be removed from the state of anesthesia in the future?


I'd say an interruption of existing sentience is a very different thing. I have different feelings about abortion than about euthanasia in cases of brain death.


Yes, killing sleeping people is immoral.


Being under anesthesia is extremely different from being asleep.


Should we have a responsibility to maintain a simulation?

