The role of quantum mechanics was originally to describe the behavior of individual atoms and molecules. This is the way of the Schrodinger equation or matrix mechanics, the harmonic oscillator, the hydrogen atom, infinite-dimensional Hilbert spaces, quantization, and so on. I would like to call it "hermitian" quantum mechanics since the Hamiltonian is a hermitian operator.
Then there is the quantum mechanics which describes engineered quantum systems like quantum dots and quantum logic gates. Here time evolution is in discrete steps, Hilbert spaces are finite-dimensional, and probabilities are discrete instead of continuous. I think it is apt to call this "unitary" quantum mechanics since one essentially only considers exponentiated Hamiltonians.
It is important not to confuse the two. If you know hermitian quantum mechanics then unitary quantum mechanics is conceptually straightforward. If you know unitary quantum mechanics then you will have a lot of new concepts and mathematics to learn before you understand hermitian quantum mechanics (but of course you may know more about applications).
The programs mentioned in the article teach unitary quantum mechanics: sufficient for engineering, insufficient for physics. If we assume that the engineering world is becoming increasingly quantum then it is perhaps not a bad thing.
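As a minimal sketch of that distinction (assuming NumPy/SciPy and an arbitrary made-up Hamiltonian, not anything from the comment above): a hermitian H has real eigenvalues, and exponentiating it gives exactly the unitary step that "unitary" quantum mechanics works with.

```python
import numpy as np
from scipy.linalg import expm

# Build an arbitrary 2x2 hermitian Hamiltonian (hbar = 1, values are random).
A = np.random.randn(2, 2) + 1j * np.random.randn(2, 2)
H = (A + A.conj().T) / 2

print(np.allclose(np.linalg.eigvals(H).imag, 0))   # hermitian => real eigenvalues

# One discrete time step of "unitary" quantum mechanics: U = exp(-i H dt).
dt = 0.1
U = expm(-1j * H * dt)
print(np.allclose(U.conj().T @ U, np.eye(2)))      # U is unitary: U^dagger U = I
```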
Then there is radioactivity, which was (and still is) one of the most important topics in physics (nuclear physics), and it is mainly about describing decaying states. That uses non-hermitian QM. Since hermitian operators always give real eigenvalues (the proof is left to the reader), non-hermitian operators can have complex eigenvalues, and the imaginary part of such an eigenvalue means the probability of finding the particle decreases exponentially with time (decay).
That is actually an approximation, and it violates the QM postulate that evolution should be unitary (probability is obviously not conserved).
People who study this more rigorously define a bigger Hilbert space that includes not only the particle (atom) but also the decay products; only when you solve the system with the states of mother plus daughters do you get back to your ordinary, simple-ish quantum mechanics.
The idea is that the decreasing probability of finding the particle is offset by the increasing probability of finding the decay products, so the total probability is conserved and the time-evolution operator is unitary again.
Hint: it is not as simple as ordinary QM; you sometimes have to worry about resonances and mixed states, and modeling these things mathematically is much harder than solving your ordinary hermitian Hamiltonian.
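To make the complex-eigenvalue point concrete, here is a toy numerical sketch (made-up numbers, using the usual effective-Hamiltonian form E = E0 - i*Gamma/2; an illustration, not a claim about any specific decay):

```python
import numpy as np
from scipy.linalg import expm

# A single decaying level with energy E0 and width Gamma (hbar = 1, toy values).
E0, Gamma = 1.0, 0.2
H_eff = np.array([[E0 - 0.5j * Gamma]])     # non-hermitian effective Hamiltonian

psi0 = np.array([1.0 + 0j])
for t in (0.0, 5.0, 10.0):
    psi_t = expm(-1j * H_eff * t) @ psi0
    survival = abs(psi_t[0]) ** 2           # probability the particle has not decayed
    print(t, survival, np.exp(-Gamma * t))  # matches exp(-Gamma * t): exponential decay
```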
I've started recently in a subset of the domain and I feel that there is very strong potential in quantum computing, but there are still very important fundamental problems that have not been solved, so I believe that the hype comes too early.
There are very interesting algorithms that run on existing quantum machines. They are quite specialized (don't expect to run a quantum videogame any time soon, but expect to be able to simulate large physical systems or networks or power grids, etc.), but such algorithms could, if we had a sufficient number of stable qubits, solve problems that currently require entire datacenters, faster and with machines that need only a few times the energy of a desktop PC. However, no machine has a sufficient number of stable qubits, and it's currently unclear how to build machines that both have sufficient qubits and can run the same algorithms.
In a way, we're currently where traditional computing was in the early 50s. There is a feeling that all the difficult problems have been solved and the race is on between ~40 companies to be the first to build such machines. Everybody's marketing department claims to have the solution (or even to have working machines that can run these algorithms), but to the best of my knowledge, nobody has demonstrated them. There is a feeling that quantum computing will change the world. But I doubt it will happen overnight.
I suppose my question would be - are we in the early fifties or still back in the forties?
By 1951 there were stored program computers that were running real programs that did useful things. Yeah, they were super-expensive, unreliable, and laughably limited by today’s standards, but they were a real thing and even at that stage they had applications that people who weren’t tinkering with the machines themselves cared about.
We don’t have that with quantum computing yet, do we?
I don't think that there is a 1:1 mapping. If this is any datapoint, there are several companies (including the one for which I work) that offer cloud access to their quantum machines, letting other labs/companies develop algorithms to solve their real problems with these machines.
Are 100% stable qubits really that much of a requirement? I had the impression that allowing some instability (which you have in any real physical environment) is fine for running approximate algorithms, i.e. algorithms that give you a good-enough solution with a high-enough probability.
Clarifying because my previous message was over-ambitious: I started recently in the domain, so I'm still very much in the learning phase. Take everything I write with a pinch of salt.
That being said, to the best of my understanding, you are absolutely right. It is entirely possible to have algorithms without stable/corrected qubits. However, developing an algorithm without stable/corrected qubits is something that can take years of research and a PhD in quantum mechanics – the only part that looks remotely like programming is that quantum algorithm researchers use Python at some point in their toolchain to set up the system.
On the other hand, with stable/corrected qubits, there is the hope that, some day, the industry can build quantum processors with gates comparable to the logic gates that power today's computers. This would in turn let developers program with quantum programming languages – in fact, some quantum programming languages that are recognizable as programming languages have already been designed; they just can't run on any actual hardware yet.
I’m starting a PhD in a related field, and it seems like the biggest reason to be excited is simulation of small and complex physics systems, e.g. molecular interactions.
Breaking RSA/ECDSA is very cool, but doesn’t actually enable new industries or products. We’ll just shift to using different cryptography that quantum computing can’t break.
Maybe quantum key distribution will become really important in some sectors. But if you aren’t seriously worried about man-in-the-middle attacks on your communications, QKD won’t make a difference in your life.
But efficiently simulating chemical and physical interactions could open up whole industries with advancements in material science, pharmaceuticals, etc.
"But efficiently simulating chemical and physical interactions"
Are there real chances this is going to work reliably anytime soon?
I don't know much about quantum computing, but to me it seems I would rather bet on GPUs for large simulations (as far as I know, they are currently mostly run on CPUs).
If you want to faithfully capture real-time quantum effects, your program's memory and runtime scale exponentially with the number of particles. A quantum simulation scales polynomially.
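A rough back-of-the-envelope sketch of that exponential wall (my own illustration, assuming two-level particles and one complex128 amplitude, 16 bytes, per basis state):

```python
# Memory needed just to store the full state vector of n two-level particles
# on a classical machine: 2**n complex amplitudes at 16 bytes each.
for n in (10, 20, 30, 40, 50):
    bytes_needed = (2 ** n) * 16
    print(f"{n} particles: 2^{n} amplitudes, {bytes_needed / 2**30:.3g} GiB")
```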
I wrote some quantum library code and ended up interviewing with two quantum startups and Amazon's lab at Caltech.
For people who've been in the industry for a while, the past 5 years have been amazing - (small, noisy = NISQ) quantum computers are real, there's VC and government money in it. Aside from the industries you mentioned, there's interest in chemistry and medicine (for example: https://www.proteinqure.com ) and you can work with smart people. Edit: should also mention post-quantum encryption research which is adjacent to this space and in practice at Google, Cloudflare, Microsoft.
That said, if I joined a quantum company 2-5 years ago and worked long startup hours, I could be disappointed that it's not practical yet, and ML / NLP has taken off. So it's up to you what your alternatives are and what seems like a "win".
He's being sarcastic about the slow progress. They can barely factor small numbers like this and that's only after assuming that the answer will be 3*7.
Disclaimer: I am NOT involved in the industry. I program (classical) computers for a living, and have no degrees.
As I understand it, the true power of a quantum computer is simulating quantum systems. Replacing a classical supercomputer running modeling software (based on DFT?) with a quantum computer would, I think, be one of the largest early uses of quantum computing in terms of economic impact.
Drug discovery (small molecule drugs!), materials science, anything that would benefit from a substantial/revolutionary increase in our computational chemistry reach -- those are the reasons I care.
I was pretty excited about the potential for better quantum simulations (and still am), but interestingly enough, at least in specific areas like protein folding, transformers and LLMs have made such huge leaps over the past few years that the bar has gotten higher for quantum computers, at least in protein engineering. Plus there's probably some waning of interest when you go from "man, it would be great if quantum computers solved this" to "oh wow, these AI models can approximate solutions to a lot of the cases I care about; I should focus on that now."
Obviously, having an analytic solution to a protein structure would be awesome, but when protein design tools are approaching double-digit success percentages (depending on the protein) in creating designs that work in the lab, that's a lot more compelling than waiting for IBM!
Quantum is still cool though and people should do it since we'll outgrow classical computing for this stuff soon enough (what if two proteins?)
By contrast, it took roughly 10 years from the invention of the transistor in 1947 to a 30,000 transistor computer (IBM 7070 in 1958) and a fully functional 100 transistor MOSFET chip in 1964.
Even vacuum tubes went from the invention of the triode in 1906 to the flip-flop in 1918 to a computer in 1939, while quantum mechanics was being discovered at the same time.
Qubit counts are barely at 1,000 in 2023 (the qubit was conceived around 1988, with a lot of groundwork beforehand), and they barely work. Progress on increasing that number has been very slow.
Qubits are still a research problem and not an engineering problem.
Problem 2: Problems and algorithms don't map as easily as everybody claims
There's a lot of "Algorithm X is faster in Quantum than Classical."
A lot of those claims are of the form "If we can build quantum circuits P, D, and Z, we can map Algorithm X to a quantum computer." And a lot of those assumptions are, quite bluntly, bullshit. We can't build circuit P, D, or Z and make it work, so it doesn't matter how well the theorists can map the algorithm.
This also all presupposes we don't have better classical algorithms. Whenever I talk to quantum computing folks they generally point out that the one thing we have a hope of mapping to quantum solidly, factoring, is an odd man out in the way it maps. A couple of them think that there's still some missing knowledge in classical algorithms around that.
Interesting that your question asking for realistic views from inside the industry elicited breathless hype comments from (presumably) outside the industry.
I can't really; there are some really exciting scientific things that may emerge in the next few years, but anything related to consumer electronics is at least fifteen years off and will probably see quite slow adoption when it's available - that's especially true for anything security related.
Quantum ML will be game changing. Running model training on a quantum computer can find GLOBAL minima/maxima in fractions of a second, depending on the number and quality of the qubits of course.
This comment makes me feel like people talking like this will make “quantum” into the new crypto and grift a whole new generation of people. The cycle continues.
You're a genius. Quantum Blockchain that lets you bid on an NFT lootbox being in a particular superposition. Based on the probability, the price will be higher or lower. And immutable. Or something.
I have heard a lot of people talk about how this is going to solve NP-hard problems, but when I asked a CS PhD about it they were a lot more pessimistic.
I don't see how you can really get a decent grasp of quantum with an undergrad. The standard American physics curriculum has some quantum in sophomore year with Modern Physics, and then Quantum in Junior/Senior year. But you can't exactly skip mechanics, E&M and all of the mathematics (Calc 1-3, Diff EQ, Partial Diff EQ, Linear Algebra) you need a background in. So you pretty much need 2 years of prep to really start learning. Even if you add some specific technology courses around the engineering, how do you get around this undergraduate program not being a Physics or Applied Physics degree, without throwing the baby out with the bathwater?
I took all these courses as a non-physics-aligned computer science major in a top engineering program. I even had the theoretical probability and statistics requirements you need, too. The fact was, though, that our base requirements meant that if you came in without placing out of basics like Calculus 1, intro physics, etc., you were going to be there for six years, or five years plus summers. This wasn't technically allowed, but they waved hands somehow and it was tolerated by the university system. I feel I had enough background to successfully study a year of progressive quantum computing courses (but didn't, as they weren't available).
Of course in the last decade or so they relaxed a lot of non computer science requirements and offered more elective slots, dumbing down the requirements and offering greater specialization for industry. But my point is, you certainly can offer a quantum computing degree with sufficient depth in an American university. It’ll just be a hard degree.
You don't really need a deep background in the physics side of things to utilize or help build quantum systems. Most of quantum is serious, but still "traditional", engineering of various disciplines (RF, EE, OMECH, CS, etc.). The physics part is a comparatively small portion of it IME.
Excluding diff eq, the rest of the mathematics is standard if you are doing e.g. machine learning, and diff eqs are not that hard to pick up anyway. At higher levels of ML and information theory, the math of statistical mechanics is covered too; for example, Ising models are generalized into Hopfield networks and message passing/belief propagation. Most of quantum computing boils down to a few very specific matrix gates, and the actual finicky physical details have very little to do with the algorithmic implementations. Classical mechanics and E&M are irrelevant here. You hire quantum computing people to figure out the algorithmic and compute stuff; if you want somebody to debug waveguides, there are plenty of unemployed EE graduates.
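For a flavor of what "a few very specific matrix gates" means in practice, here is a plain-NumPy sketch with the standard textbook matrices (nothing hardware-specific is assumed):

```python
import numpy as np

I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                  # controlled-NOT on two qubits

# Start two qubits in |00>, apply H to the first, then CNOT: a Bell state.
psi = np.kron([1, 0], [1, 0]).astype(complex)
psi = CNOT @ (np.kron(H, I) @ psi)
print(np.round(psi, 3))   # amplitude 1/sqrt(2) on |00> and |11>, zero elsewhere
```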
The hardest part of QM is relating it back to "actual physics"; if you work with abstract systems such as qubits, QM is not anywhere near as difficult.
I think the trick here is quite clear from the article: these programs simply do not aim for a "decent grasp of quantum". That is at least my take-away from a course that does not consider the hydrogen atom "a real-world example"...
You can do applied quantum logic in an afternoon (with e.g. colab and cirq, qiskit, and/or tequila) but then how much math is necessary; what is a "real conjugate"?
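For what such an afternoon looks like, a minimal sketch with cirq's simulator (assuming a recent cirq release; the circuit is just the standard Bell-state example):

```python
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(q0),                     # put the first qubit in superposition
    cirq.CNOT(q0, q1),              # entangle the two qubits
    cirq.measure(q0, q1, key='m'),
)
result = cirq.Simulator().run(circuit, repetitions=100)
print(result.histogram(key='m'))    # roughly half 0 (|00>) and half 3 (|11>)
```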
In the same way you can "do ML" without knowing linear algebra and probability theory. Such people can barely extend anything, let alone design new models from scratch.
E.g. quantum embedding isn't yet taught to undergrads, but it can be explained quickly to folks interested in the field who aren't deterred by laborious newspaper summarizations and who might pursue this strategic and critical skill.
How many ways are there to roll a 6-sided die with qubits and quantum embedding?
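One hedged answer sketch (again with cirq, and again my own illustration rather than anything from the thread): put three qubits in uniform superposition to get outcomes 0-7, then rejection-sample away 6 and 7 so the remaining six faces stay equally likely.

```python
import cirq

qubits = cirq.LineQubit.range(3)
circuit = cirq.Circuit(
    [cirq.H(q) for q in qubits],           # uniform superposition over 0..7
    cirq.measure(*qubits, key='roll'),
)
sim = cirq.Simulator()

rolls = []
while len(rolls) < 10:
    bits = sim.run(circuit).measurements['roll'][0]
    value = 4 * int(bits[0]) + 2 * int(bits[1]) + int(bits[2])
    if value < 6:                          # reject 6 and 7 to keep the die fair
        rolls.append(value + 1)            # report faces 1..6
print(rolls)
```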
It took years for tech to completely and entirely rid itself of the socially-broken nerd stereotypes that pervaded early digital computing as well.
How can we get enough people into QIS/quantum fields to supply the demand for new talent?
While I somewhat regret selling most of my college textbooks back, I feel that cramming for non-applied tests and quizzes was something I needed to pay them for me to do.
TIL about memory retention: spaced-repetition interval training, plus projects with written-communication components to apply it.
Note that, in practice, quantum logic does not have quantum hardware that can run it at the moment. So that might limit the usability of such knowledge for the near future.
I think quantum computing will look like web development and ML: there will be Djangos and TensorFlows and LangChains and VC influencer shills and loads of junior data science roles filled by English graduates, and there will not be an iota of computer science in sight.
Since when have universities pivoted from training for academia to training for industry?
While I was in university, the classic undergraduate Computer Science program was described to me by the program's advisors as being for academics. If I recall correctly, "those who get A's become professors, those who get C's go into industry."
That seems like a perverse incentive. "Those who get As become low-wage adjuncts and those who get Cs end up with some of the highest salaries in the country." Feels like getting an A is a good way to end up nowhere.
So which is it? Reading the last few HN threads about quantum computers gave me the impression that we don't even know if there will be any practical use for them, as they seem super specialized, and there are technical challenges that no one knows how to solve. I'm confused.
Purely speaking to the article, which is about careers. The open question is how much money will be put into the industry before payoff/abandonment. But the industry kicked off over 20 years ago and the trajectory looks great.
> Instead [of the Bohr model], Morello uses a real-world example in his teaching — a material called a quantum dot, which is used in some LEDs and in some television screens. “I can now teach quantum mechanics in a way that is far more engaging than the way I was taught quantum mechanics when I was an undergrad in the 1990s,” he says.
> Morello also teaches the mathematics behind quantum mechanics in a more computer-friendly way. His students learn to solve problems using matrices that they can represent using code written for the Python programming language, rather than conventional differential equations on paper.
>> This "Quantum Computing for Computer Scientists" video https://youtu.be/F_Riqjdh2oM explains classical and quantum operators as just matrices. What are other good references?
Unfortunately the QuantumQ game doesn't yet have the matrix forms of the quantum logical operators in the (open source) game docs.
Would be a helpful resource, in addition to the Quantum logic wikipedia page and numpy and/or SymPy without cirq:
A Manim presentation demonstrating that quantum logical operator matrices are Bloch sphere rotations, are reversible, and why we restrict operators to the category of unitary transformations
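Until such a presentation exists, the unitarity and reversibility claims are easy to check with numpy alone (a sketch using the usual textbook gate matrices):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
S = np.array([[1, 0], [0, 1j]])                # phase gate

for U in (H, S, S @ H):
    assert np.allclose(U.conj().T @ U, np.eye(2))   # unitary: U^dagger U = I

psi = np.array([1, 0], dtype=complex)          # |0>
out = S @ H @ psi                              # run a tiny "circuit"
back = H.conj().T @ S.conj().T @ out           # reversible: apply daggers in reverse
assert np.allclose(back, psi)
print("unitary and reversible")
```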
> His colleagues at the UNSW are also developing laboratory courses to give students hands-on experience with the hardware in quantum technologies. For example, they designed a teaching lab to convey the fundamental concept of quantum spin, a property of electrons and some other quantum particles, using commercially available synthetic diamonds known as nitrogen vacancy centres
A Manim walkthrough that flies from top-down to low flyover with the wave states at each point in the circuit would be neat. Do classical circuit simulators simulate backwards, nonlinear flow of current?