
Well, when the cherry-picked videos from OpenAI have a cat sprouting a fifth leg for no reason, people are right to be skeptical.


I have several thoughts on the concept of 'hallucination'. Firstly, most people do it regularly; I'm not sure why this alone is indicative of not understanding. Secondly, consider our dreams (the closest we can get, in my view, to making the human brain produce images without physical reality interfering): we produce very similar hallucinations. When you recall a dream and examine the details, there are things that just kind of happen and make sense at the time, but on closer inspection you're left thinking 'huh, that was strange'.

The images we get from these neural networks are trained to look pleasing, for some definition of pleasing. That's why they look good on the whole, but fall into the uncanny valley the moment you go inspecting. Similar to dreams.

Whereas obviously real human perception is (usually) grounded in reality.




