This is not really that worrying, IMO. We already have weaponized toxins, viruses, and enough explosives to blow up the entire planet. So what if an AI can come up with something a little worse? It isn't the existence of these things that's stopping us all from killing each other.
