>a super-intelligence just beats human intelligence [e]very time.
You are overstating the case here. Super-intelligence is superior to human intelligence, but it isn't magic. There are situations where an advantaged human will beat a disadvantaged super-intelligence.
I suppose I am. Still, a sufficiently powerful optimising process will find a way to escape if any such way exists; to claim that you can properly box the AI is to claim that you can box it so that no possibility of escape exists whatsoever, which is a big claim.
What's more, the AI only has to beat you once, so to keep the AI boxed indefinitely, the advantaged human has to beat the disadvantaged super-intelligence every single time, forever.
> the advantaged human has to beat the disadvantaged super-intelligence every single time, forever.
There are alternatives. The human only has to keep the AI boxed until the AI has augmented the human's intelligence, helped create human uploads, or helped create a provably friendly AI.