Hacker News

I'd like to place a bet, but unfortunately there's no point in betting that by a certain date AIs will beat median human intelligence by huge margins. It would be like betting on a prediction of any other extinction event: impossible to collect, even if the prediction is right.

On the other hand, just for the fun of it, a couple of other predictions. Here's one: "true, human-level AIs will not be developed until 8 TB RAM sticks become a commodity". Another: "a true, high-fidelity, multi-sensorial brain-computer interface will never be built".



It'll be interesting to see what happens with neural networks: whether copying the brain is the "shortcut" to true AI, or in fact the only way to achieve it.


It is possible to bet on the apocalypse, though it's not very practical: http://lesswrong.com/lw/ie/the_apocalypse_bet/



