Is this all still totally hype? Anything close to practical? Wikipedia[1] shows:
year 2001: factor 15
year 2012: factor 21
year 2012: factor 143
At that rate of progress we might be able to factor something useful before we reach the heat death of the universe. But I guess the topic is good for lots of scary headlines.
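To put a rough number on "that rate of progress": here's a back-of-envelope extrapolation (the linear-bits-per-year assumption is mine, and the record dates are approximate):

```python
# Naive extrapolation of the factoring records above, assuming record
# bit-length grows linearly over time (a generous assumption).
from math import log2

records = [(2001, 15), (2012, 143)]   # (year, largest number factored); dates approximate
(y0, n0), (y1, n1) = records

rate = (log2(n1) - log2(n0)) / (y1 - y0)   # bits of progress per year
target = 2048                               # RSA-2048 modulus size in bits
years_needed = (target - log2(n1)) / rate

print(f"~{rate:.2f} bits/year; RSA-2048 reached in ~{years_needed:,.0f} more years")
```

Not quite the heat death of the universe, but several millennia either way.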
Since building a machine that can run Shor's algorithm at useful scale depends on when, and in what order, a series of novel advances lands across several fields (physics and mathematics at least), there's a lot of uncertainty about when that ball is going to drop. Given that, plus how long it takes to retire legacy crypto even when we know it's dead as a doornail, and how long it takes to vet a cryptographic algorithm, I think there's no time like the present to get started on post-quantum crypto.
there's no time like the present to get started on post-quantum crypto
You present an interesting argument.
HOWEVER, the cynic in me questions why the NSA made similar arguments not too long ago. Is it because if everyone switches to new algorithms there will be myriad new code, with a plethora of new bugs? Won't it make NSA's life much much easier for years to come? How many years did it take us to find "goto fail; goto fail;"?
Nobody is arguing for switching production systems to the new algorithms anytime remotely soon. The closest I've heard of is Google adding a mixed ECC/post-quantum key-exchange algorithm in Chrome's Canary (bleeding edge channel)[1], but the whole point of the construction is that you need to break both (or break XOR, I guess :P). As far as I can tell, we're still in the early "is this algorithm any good" stage as opposed to the "did we implement the algorithm right" stage.
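The core idea of that hybrid construction can be sketched in a few lines: derive the session key from both shared secrets, so an attacker has to break both exchanges. (This is an illustrative sketch, not Google's actual code; their experiment, CECPQ1, combined X25519 with the NewHope lattice scheme.)

```python
import hashlib

def combine_secrets(ecc_secret: bytes, pq_secret: bytes) -> bytes:
    # Concatenate-then-hash combiner: the output stays unpredictable
    # as long as EITHER input remains secret.
    return hashlib.sha256(ecc_secret + pq_secret).digest()

# Toy usage with placeholder secrets (real ones come from the two key exchanges):
session_key = combine_secrets(b"ecdh-shared-secret", b"pq-shared-secret")
print(session_key.hex())
```

Even if the post-quantum scheme turns out to be broken classically, the attacker still faces the ECC half, and vice versa.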
More generally, it's pretty easy to turn any security recommendation by the NSA into a plot (they recommend SHA-3 because they broke it and not SHA-2!). I'd save my suspicion for cases where they (a) give advice contrary to independent experts or (b) give strangely specific guidance with no apparent rationale.
We are a very long way away from having quantum factorization of integers. Quantum computers will be more useful for things like chemistry problems in the near term.
A lot of the "progress" is masked by a "Wirth's law"-style effect related to the requirement for quantum error-correction in order to carry out long quantum computations. Briefly, a quantum memory can never be perfectly accurate, and the limit of known physical implementations is close to 1 error in 1000 "clock cycles". In order to overcome this, error-correction is required, which creates overhead. The long-term outlook is still generally positive.
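To give a feel for that overhead, here's a rough sketch assuming surface-code-style scaling (the constants and formulas are standard ballpark figures I'm supplying, not from the thread: logical error per cycle ~ 0.1·(p/p_th)^((d+1)/2), and a distance-d patch uses ~2d² physical qubits):

```python
p = 1e-3        # physical error rate: ~1 error in 1000 cycles, as above
p_th = 1e-2     # assumed error-correction threshold
target = 1e-12  # logical error rate needed for a long, Shor-sized computation

# Grow the code distance (odd by convention) until the logical
# error rate per cycle drops below the target.
d = 3
while 0.1 * (p / p_th) ** ((d + 1) / 2) > target:
    d += 2

print(f"code distance ~{d}: ~{2 * d * d} physical qubits per logical qubit")
```

So even with qubits at the 1-in-1000 level, each logical qubit costs on the order of hundreds of physical ones, which is where a lot of the apparent lack of progress goes.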
One problem with scaling quantum computers is that while the individual qubits are small and getting more stable, it is unclear how to miniaturize the hardware needed to read and write them. Those control components are still things you can touch with your hands without breaking them, so you can't have millions of them in one device.
[1] https://en.wikipedia.org/wiki/Quantum_computing