I recently started working in a subset of the domain. I feel there is very strong potential in quantum computing, but there are still important fundamental problems that have not been solved, so I believe the hype has come too early.
There are very interesting algorithms that run on existing quantum machines. They are quite specialized (don't expect to run a quantum video game any time soon, but do expect to simulate large physical systems, networks, power grids, etc.), but such algorithms could, if we had a sufficient number of stable qubits, solve problems that currently require entire datacenters, faster and on machines that need only a few times the energy of a desktop PC. However, no machine has a sufficient number of stable qubits, and it's currently unclear how to build machines that both have enough qubits and can run these algorithms.
In a way, we're currently where traditional computing was in the early 50s. There is a feeling that all the difficult problems have been solved and the race is on between ~40 companies to be the first to build such machines. Everybody's marketing department claims to have the solution (or even to have working machines that can run these algorithms), but to the best of my knowledge, nobody has demonstrated them. There is a feeling that quantum computing will change the world. But I doubt it will happen overnight.
I suppose my question would be - are we in the early fifties or still back in the forties?
By 1951 there were stored program computers that were running real programs that did useful things. Yeah, they were super-expensive, unreliable, and laughably limited by today’s standards, but they were a real thing and even at that stage they had applications that people who weren’t tinkering with the machines themselves cared about.
We don’t have that with quantum computing yet, do we?
I don't think that there is a 1:1 mapping. If this is any datapoint, there are several companies (including the one for which I work) that offer cloud access to their quantum machines, letting other labs/companies develop algorithms to solve their real problems with these machines.
Are 100% stable qubits really that much of a requirement? I had the impression that tolerating some instability (which you have in any real physical environment) is fine for running approximate algorithms, i.e. algorithms that give you a good enough solution with a high enough probability.
Clarifying because my previous message was over-ambitious: I started recently in the domain, so I'm still very much in the learning phase. Take everything I write with a pinch of salt.
That being said, to the best of my understanding, you are absolutely right. It is entirely possible to have algorithms that work without stable/corrected qubits. However, developing such an algorithm can take years of research and a PhD in quantum mechanics – the only part that looks remotely like programming is that quantum algorithm researchers use Python at some point in their toolchain to set up the system.
On the other hand, with stable/corrected qubits, there is hope that, some day, the industry can build quantum processors with gates comparable to the logic gates that power today's computers. This would in turn let developers program in quantum programming languages – in fact, some quantum programming languages that are recognizable as programming languages have already been designed; they just can't run on any actual hardware yet.
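To give a feel for what "recognizable as a programming language" means here, below is a minimal gate-level sketch in Python using Qiskit (just one illustrative SDK I'm picking as an example, not necessarily the languages I had in mind above). It builds a two-qubit Bell-state circuit out of Hadamard and CNOT gates, the kind of logical gates error-corrected hardware would ideally run reliably:

    from qiskit import QuantumCircuit

    # Two qubits, two classical bits to hold measurement results.
    qc = QuantumCircuit(2, 2)
    qc.h(0)                     # Hadamard: put qubit 0 into superposition
    qc.cx(0, 1)                 # CNOT: entangle qubit 0 with qubit 1
    qc.measure([0, 1], [0, 1])  # read both qubits out

    print(qc.draw())            # ASCII diagram of the circuit

This already looks like ordinary programming; the catch is that on today's noisy hardware the results degrade quickly, and at useful problem sizes it needs the error correction that doesn't exist yet.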
I’m starting a PhD in a related field, and it seems like the biggest reason to be excited is simulation of small and complex physics systems, e.g. molecular interactions.
Breaking RSA/ECDSA is very cool, but doesn’t actually enable new industries or products. We’ll just shift to using different cryptography that quantum computing can’t break.
Maybe quantum key distribution will become really important in some sectors. But if you aren’t seriously worried about man-in-the-middle attacks on your communications, QKD won’t make a difference in your life.
But efficiently simulating chemical and physical interactions could open up whole industries with advancements in material science, pharmaceuticals, etc.
"But efficiently simulating chemical and physical interactions"
Is there a real chance this is going to work reliably anytime soon?
I don't know much about quantum computing, but it seems to me I would rather bet on GPUs for large simulations (as far as I know, they are currently mostly run on CPUs).
If you want to faithfully capture real-time quantum effects, your program's memory and runtime scale exponentially with the number of particles. A quantum simulation scales polynomially.
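As a back-of-the-envelope illustration of that exponential wall (assuming a naive dense state-vector representation – smarter classical methods exist for restricted cases, but the generic scaling is still exponential):

    # Naive classical simulation stores one complex amplitude per basis state:
    # 2**n amplitudes for n qubits. At 16 bytes each (complex128), memory
    # grows exponentially with system size.
    for n in (30, 40, 50, 60):
        bytes_needed = (2 ** n) * 16
        print(f"{n} qubits -> {bytes_needed / 2**30:,.0f} GiB of RAM")

That goes from 16 GiB at 30 qubits to billions of GiB at 60, which is why classical simulation of general quantum systems hits a hard ceiling.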
I wrote some quantum library code and ended up interviewing with two quantum startups and Amazon's lab at Caltech.
For people who've been in the industry for a while, the past 5 years have been amazing - (small, noisy = NISQ) quantum computers are real, and there's VC and government money in it. Aside from the industries you mentioned, there's interest in chemistry and medicine (for example: https://www.proteinqure.com ) and you can work with smart people. Edit: I should also mention post-quantum encryption research, which is adjacent to this space and already in practice at Google, Cloudflare, and Microsoft.
That said, if I had joined a quantum company 2-5 years ago and worked long startup hours, I could be disappointed that it's not practical yet, while ML / NLP has taken off. So it's up to you what your alternatives are and what seems like a "win".
He's being sarcastic about the slow progress. They can barely factor small numbers like this and that's only after assuming that the answer will be 3*7.
Disclaimer: I am NOT involved in the industry. I program (classical) computers for a living, and have no degrees.
As I understand it, the true power of a quantum computer is simulation of quantum systems. Replacing a classical supercomputer running modeling software (based on DFT?) with a quantum computer would, I think, be one of the early uses of quantum computing with the largest economic impact.
Drug discovery (small molecule drugs!), materials science, anything that would benefit from a substantial/revolutionary increase in our computational chemistry reach -- those are the reasons I care.
I was pretty excited about the potential for better quantum simulations (and still am), but interestingly enough, at least in specific areas like protein folding, transformers and LLMs have made such huge leaps over the past few years that the bar's gotten higher for quantum computers, at least in protein engineering. Plus there's probably some waning of interest when you go from "man, it would be great if quantum computers solved this" to "oh wow, these AI models can approximate solutions to a lot of the cases I care about – I should focus on that now."
Obviously having an analytic solution to a protein structure would be awesome but when protein design tools are approaching double digit success percentages (depends on the protein) in creating designs that work in the lab - that's a lot more compelling than waiting for IBM!
Quantum is still cool though and people should do it since we'll outgrow classical computing for this stuff soon enough (what if two proteins?)
By contrast, it took roughly 10 years from the invention of the transistor in 1947 to a 30,000-transistor computer (the IBM 7070 in 1958), and a fully functional 100-transistor MOSFET chip followed in 1964.
Even vacuum tubes went from the invention of the triode in 1906 to the flip-flop in 1918 to a computer in 1939, while quantum mechanics was being discovered at the same time.
Qubit counts are barely at 1,000 in 2023 (the qubit was invented around 1988, though with a lot of groundwork beforehand), and those qubits barely work. Progress on increasing that count has been very slow.
Qubits are still a research problem and not an engineering problem.
Problem 2: Problems and algorithms don't map as easily as everybody claims
There's a lot of "Algorithm X is faster in Quantum than Classical."
A lot of those claims are of the form "If we can build quantum circuit P, D, and Z, we can map Algorithm X to a Quantum Computer." And a lot of those assumptions are, quite bluntly, bullshit. We can't build circuit P, D, or Z and make it work so it doesn't matter how well the theorists can map the algorithm.
This also all presupposes we don't have better classical algorithms. Whenever I talk to quantum computing folks they generally point out that the one thing we have a hope of mapping to quantum solidly, factoring, is an odd man out in the way it maps. A couple of them think that there's still some missing knowledge in classical algorithms around that.
Interesting that your question asking for realistic views from inside the industry elicited breathless hype comments from (presumably) outside the industry.
I can't, really. There are some really exciting scientific things that may emerge in the next few years, but anything related to consumer electronics is at least fifteen years off and will probably see quite slow adoption when it's available - that's especially true for anything security related.
Quantum ML will be game changing. Running model training on a quantum computer can find GLOBAL minima/maxima in fractions of a second, depending on the number and quality of the qubits of course.
This comment makes me feel like people talking like this will make “quantum” into the new crypto and grift a whole new generation of people. The cycle continues.
You're a genius. Quantum Blockchain that lets you bid on an NFT lootbox being in a particular superposition. Based on the probability, the price will be higher or lower. And immutable. Or something.
I have heard a lot of people talk about how this is going to solve NP-hard problems, but when I asked a CS PhD about it, they were a lot more pessimistic.
A lot of the hype seems to come from physicists excited about physics (understandably), and spooks who want to crack public key encryption.
Can anyone convince me I should care?