
No. If you forbid the dissemination of nuclear research and materials, it's very difficult even for highly-resourced organizations (e.g. the government of Iran) to produce nuclear devices. But if you were to forbid the dissemination of all future "AI research" tomorrow, there are probably a million people worldwide who could each recreate the "dangerous capabilities" from what they already carry in their heads and on their laptops, as long as someone gave them a few million dollars' worth of commodity (and thus not restrictable) computing hardware.

Nuclear technology is tricky to get working even if you have decent scientists and all the published research about the basic principles. With current ML tech, the key point is that the core techniques are really simple: the plain textbook methods work at large scale, and you don't necessarily need any secret sauce that could be restricted. Yes, a lot of engineering work goes into scaling a full system efficiently, but unlike nuclear weapons, all of that can be replicated by any decent bunch of engineers.

The cat is out of the bag and the genie isn't going back in the bottle: all the "sensitive knowledge" you'd need to restrict AI is already known by at least a few CS undergrads at every single university in the world.



For nuclear research you could be right, but not so much for bioweapons. Twenty years ago you could already synthesize the polio virus from scratch using only publicly available data: https://www.science.org/content/article/poliovirus-baked-scr...



