
I agree that this behavior is insane and should be fixed.

Do however note that it is possible to install another keyboard on iOS, which may alleviate your suffering before you switch to Android in about 120 days.

Personally I rely on Gboard [0] every day for the simple reason that it auto-detects several (more than two) languages, and of course it has the added benefit of not having this crazy bug. Gboard is Google software, however, so it does come with huge privacy issues, and others will hopefully point out better alternatives.

[0] https://en.wikipedia.org/wiki/Gboard


Gboard for iOS has been discontinued though. On top of that, 3rd party keyboards are a bit limited on iOS (which might be a good thing for some people).


Has it? It's still on the App Store. Is it just not in active development?



Gboard is a lot better than the native keyboard. Strange that OP is going to such lengths to complain when iOS supports other keyboards.

The main benefit I've found with Gboard is a larger vocabulary, and perhaps a less aggressive autocorrect that doesn't constantly try to correct technical terms into similar common words.


I’m not suggesting this is the author’s reason, but avoiding a Google product that keeps a trail of everything you type seems like a strong argument.


Via iOS permissions you can restrict internet access to the keyboard and it still works well.


I can guarantee that's 100% not his reason given that his stated alternative is switching to Android.


Yeah fair point


I’ve tried Gboard and SwiftKey on iOS.

Not sure if Google just gave up on updating the iOS variant or if Apple holds it back intentionally (probably a bit of both) but they pale in comparison to their Android counterparts.

I’d prefer a usable stock keyboard but I take your point.


SwiftKey also crashes daily. There's no good keyboard for iPhones, at least for my stubby fingers. I've literally won local mobile typing-speed contests (so it's not user error, I dare say), and the last five-ish years that I've been on iOS have been a total and utter misery. If I had some cash floating around I would be back on Android.


One of the reasons in recent times to go to the Apple ecosystem was supposedly better privacy protections and decoupling from dependency on Google. You would pay extra for the UX and privacy, among other things. Installing a third-party keyboard means that they can see what I type.


Apple has a privacy setting for each keyboard that restricts if they have network access.


Glad to learn this! I didn't know about it. But I also double-checked the UI: the warning dialog says the third party will be able to see everything I type, and the network setting isn't mentioned there.

Anyway, my point regarding the UX still stands. Apple's UX is barely as good as other major players' - not great, not terrible. Mediocrity isn't what Apple should be aiming at.


I have Gboard and have weird issues with it crashing randomly. Not sure if it's because it's hamstrung by the limitations of Apple's support for alternative keyboards or what.


Gboard hasn't been updated in 4 years and as a result the UI doesn't always display properly. It's especially jarring on iOS 26. It doesn't fit into the OS keyboard target area properly (on my iPhone 17 Pro, at least).

I've tried pretty much every reputable third-party keyboard app in the App Store. Unfortunately, there's really nothing better than the stock one.


Any properly supported third-party keyboards? Swiftkey was bought by Microsoft and lost my vote. Gboard stopped updating.


+1

I do the same, and I find it way better.


Among theoretical physicists there is little doubt that Edward Witten is currently the greatest living theoretical physicist. Here is an interview with him from a few weeks ago:

https://www.youtube.com/watch?v=sAbP0magTVY

I think it is a great watch for anyone with an interest in the field.


Yeah, Witten is unusual. He's not just a little bit better than everyone else, he's in a different league.

I knew someone who was a temporary visitor at the Institute for Advanced Study and was given an office next to Witten's. He said he wouldn't hear a sound for days, and then one day Witten would start typing and not stop until 100 pages were written, as if he had it all finished in his mind before he started typing. Somehow I'm inclined to believe it can't be far from the truth.


Eric Weinstein said that Witten is the dark lord of string theory, scotching any advance in theoretical physics.


Hasn't stopped Weinstein from publishing. That nobody takes him seriously isn't Witten's fault. At least... not directly. Witten just happens to be very rigorous and a very gifted mathematician, so he sets a high bar for the rest of the field.


Yes, and RFK Jr. says certain vaccines have never worked.

I guess what I want to convey is how sad your comment makes me. What went wrong that makes you, and anyone really, trust that man's opinion on physics?

Here is a cynical but overall rather accurate takedown of Mr. Weinstein:

https://www.youtube.com/watch?v=DUr4Tb8uy-Q


> I really miss Google Inbox

I very much agree. Also really miss the ability to quickly group related emails.

(And no, that was not the same as adding a label; for one, the whole group simply appeared as one "bigger" email in the Inbox. It was a bit like a thread that you can manually add emails to.)

When everybody got kicked out of Inbox I happened to have a group of about ten emails related to an upcoming trip. Those ten emails got de-grouped and scattered all around in the ordinary gmail interface. I would have appreciated a smoother transition...


The article is just generally sloppy.

> .. my recent trip from Abu Dhabi to LA. 24 hours door-to-door. We have the technology to reduce that to under 10.

The direct flight (on Emirates) takes 16h15m, so that leaves 7h45m not in flight. If we want to bring the total down to 10 hours just by making the flight supersonic, that would require a flight time of 2h15m, corresponding to a (ridiculous) speed well over Mach 4.
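The arithmetic is easy to check. A quick sketch, where the ~13,500 km great-circle distance and Mach 1 ≈ 1,062 km/h at cruise altitude are my own rough figures, not numbers from the article:

```python
# Back-of-envelope check: if only the flight leg gets faster, how fast
# must the aircraft fly to bring a 24 h door-to-door trip under 10 h?

GROUND_TIME_H = 24 - (16 + 15 / 60)   # non-flight overhead stays fixed: 7.75 h
FLIGHT_TIME_H = 10 - GROUND_TIME_H    # remaining budget for the flight: 2.25 h

DISTANCE_KM = 13_500                  # rough great-circle Abu Dhabi -> LA (assumption)
MACH1_KMH = 1_062                     # rough speed of sound at cruise altitude (assumption)

required_kmh = DISTANCE_KM / FLIGHT_TIME_H
required_mach = required_kmh / MACH1_KMH

print(f"required speed: {required_kmh:.0f} km/h = Mach {required_mach:.1f}")
```

With those assumptions the answer comes out above Mach 5, comfortably "well over Mach 4".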


In fairness, Astro Mechanica and Hermeus claim to have a pathway to Mach 5. Not saying I expect to see it, particularly not for regular people flights to the Middle East, but believing in it is kind of the premise of the article.

(I must admit I was more curious about Astro Mechanica's engine tech before they also threw in the intention to operate Uber for business jets...)


Not ridiculous if you’re flying above the atmosphere. SpaceX has proposed point-to-point rocket-powered hypersonic flights that connect New York to Paris in around 30 minutes.

Obviously the real problem with this idea is environmental: emissions would be substantial and nobody wants an extremely noisy rocket port near their city.


How do you imagine that working? The first thing that comes to mind is the loudness of rocket launches and powered landings; even for airports that would be too loud, at least under current regulations. So you'd probably waste time getting to some dedicated facility far out in the middle of nowhere, and climbing out of a similar hole on the other side of the trip. Then there are the regulations around closing airspace and sea lanes for launches and landings, as is currently done, which seem rather incompatible with the current system of commercial flight operations. Other relevant regulations that come to mind are the evacuation procedures and general survivability provisions that are mandatory by law for conventional commercial flights.

However I turn that idea, no matter from which point I'm looking at it, I'm not seeing it going anywhere.


Whenever I hear people talk about rocket flights I think of the Stephen King short story "The Jaunt". Humans develop near-instant transportation but you have to be unconscious while travelling. A kid avoids being sedated and is driven insane by whatever interdimensional stuff he sees in transit.

Likewise for every fit 20-something being launched at Mach 5 you'd have 10 octogenarians dying of cardiovascular complications.


And furthermore you would only be able to launch in a good weather window for takeoff and landing, and the Gs on Gemini flights (which did much the same thing) weren't comfortable either.


Musk has proposed lots of things.


good train > plane > bad train


Driving is cheaper than all of those options when it's more than one person, at least here in Florida. The more people involved in your travel, the cheaper driving gets.


Some pretty bad traffic in Miami. Last month it took me an hour to go 10 miles on a Thursday afternoon.


Hehe, my wife is from there. For the first few years she lived in Orlando I would crack jokes about how bad it is down there, but I noticed she was getting offended so I pulled back. The last three times we've gone down, though, she swears up and down that Miami drivers are the worst. Of course Orlando has I-4 as well, which is its own special place.


Get a comma.ai device and it's just not a bother.


I am confused, since even factoring 21 is apparently so difficult that it "isn’t yet a good benchmark for tracking the progress of quantum computers." [0]

So the "useful quantum computing" that is "imminent" is not the kind of quantum computing that involves the factorization of semiprimes?

[0] https://algassert.com/post/2500


Factoring will be okay for tracking progress later; it's just a bad benchmark now. Factoring benchmarks have little visibility into fault tolerance spinning up, which is the important progress right now. Factoring becoming a reasonable benchmark is strongly related to quantum computing becoming useful.


> Factoring becoming a reasonable benchmark is strongly related to quantum computing becoming useful.

Either this relation is not that strong, or factoring should "imminently" become a reasonable benchmark, or useful quantum computing cannot be "imminent". So which one is it?

I think you are the author of the blogpost I linked to? Did I maybe interpret it too negatively, and was it not meant to suggest that the second option is still quite some time away?


Under a Moore's-law-like scenario, factoring 21 happens only about five years before factoring a 1024-bit number. With all the optimizations, factoring an n-bit number only requires ~n logical qubits, but most of those optimizations only work for large n, so 21 is only a handful of doublings away from 2^1024.

The other problem is that factoring 21 is so easy that it actually makes it harder to prove you've factored it with a functional quantum computer. For big numbers, your program can fail 99% of the time, because if you get the result even once you've proven the algorithm worked. 21 is small enough that it's hard not to factor, so demonstrating that you've factored it with a quantum computer is fairly hard. I wouldn't be surprised, as a result, if the first number publicly factored by a quantum computer (using error correction) was in the thousands instead of 21. By using a number that isn't absolutely tiny, it becomes a lot easier to show that the system works.
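For a sense of scale, the entire classical side of Shor's algorithm for N = 21 fits in a few lines; the brute-force order-finding loop below is the only part a quantum computer is supposed to replace (my own sketch, not anyone's benchmark code):

```python
from math import gcd

def shor_postprocess(N, a):
    """Classical half of Shor's algorithm: given the multiplicative order r
    of a mod N (found here by brute force rather than a quantum computer),
    try to extract a nontrivial factor of N."""
    g = gcd(a, N)
    if g != 1:
        return g                      # lucky guess: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:          # order finding -- the quantum step
        r += 1
    if r % 2 == 1:
        return None                   # odd order: this base fails, try another
    f = gcd(pow(a, r // 2) - 1, N)
    return f if f not in (1, N) else None

print(shor_postprocess(21, 2))        # -> 7 (order of 2 mod 21 is 6; gcd(2^3 - 1, 21) = 7)
```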


Perhaps? The sort of quantum computers that people are talking about now are not general purpose. So you might be able to make a useful quantum computer that doesn't run Shor's algorithm.


Simulating the Hubbard model for superconductors at large scales is significantly more likely to happen sooner than factoring RSA-2048 with Shor’s algorithm.

Google have been working on this for years

Don't ask me if they've the top supercomputers beat, ask Gemini :)


Gemini hallucinated me a wild answer.


I don't think that's correct; the research projects the article is talking about all seem to aim at making general-purpose quantum computers eventually. Obviously they haven't succeeded yet, but general purpose does seem to be what they are talking about.


Basically, QCs are so far from ever doing a useful computation that we need other benchmarks to measure progress. We need to be thinking in timelines of a lifetime, not 5 years.


I always find this argument a little silly.

Like if you were building one of the first normal computers, how big a number you can multiply would be a terrible benchmark, since once you have figured out how to multiply small numbers it's fairly trivial to multiply big ones. The challenge is making the computer multiply numbers at all.

This isn't a perfect metaphor, as scaling is harder in a quantum setting, but we are mostly at the stage where we are trying to get the things to work at all. Once we reach the stage where we can factor small numbers reliably, the amount of time to go from smaller numbers to bigger numbers will probably be relatively short.


From my limited understanding, that's actually the opposite of the truth.

In QC systems, the engineering "difficulty" scales very badly with the number of gates or steps of the algorithm.

It's not like addition, where you can repeat a process in parallel and bam-ALU. From what I understand as a layperson, the size of the inputs is absolutely part of the scaling.


But the reason factoring numbers is used as the quantum benchmark is exactly that we have a quantum algorithm for that problem which is meant to scale better than any known algorithm on a classical computer.

So far it seems to take an exponentially bigger device to factor 21 than 15, and again 35 than 21, and so on, but if I understand right, at some point this levels out and it's only a little harder, relatively speaking, to factor say 10^30 than 10^29.

Why are we so confident this is true given all of the experience so far trying to scale up from factoring 15 to factoring 21?


> Why are we so confident this is true given all of the experience so far trying to scale up from factoring 15 to factoring 21?

I don't think we have any "real" experience scaling from 15 to 21. Or at least not in the way Shor's algorithm would be implemented in practice on fault-tolerant qubits.

We haven't even done 15 in a "real" way yet. I suspect the amount of time to factor 15 on fault-tolerant qubits will be a lot longer than the time to go from 15 to 21.


The algorithm in question is a hypothetical algorithm for a hypothetical computer with certain properties. The properties in question are assumed to be cheap.

In the case of quantum algorithms in BQP, though, one of those properties is SNR of analog calculations (which is assumed to be infinite). SNR, as a general principle, is known to scale really poorly.


> In the case of quantum algorithms in BQP, though, one of those properties is SNR of analog calculations (which is assumed to be infinite). SNR, as a general principle, is known to scale really poorly.

As far as i understand, that isn't an assumption.

The assumption is that the SNR of logical (error-corrected) qubits is near infinite, and that such logical qubits can be constructed from noisy physical qubits.


There are several properties that separate real quantum computers from the "BQP machine," including decoherence and SNR. Error-correction of qubits is mainly aimed at decoherence, but I'm not sure it really improves SNR of gates on logical qubits. SNR dictates how precisely you can manipulate the signal (these are a sort of weird kind of analog computer), and the QFTs involved in Shor's algorithm need some very precise rotations of qubits. Noise in the operation creates an error in that rotation angle. If your rotation is bad to begin with, I'm not sure the error correction actually helps.
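How sensitive those rotations are to control noise is easy to see numerically. A toy sketch (my own illustration, not a claim about any particular hardware): apply an over-rotated Rz to the |+> state and compare with the ideal result.

```python
import numpy as np

def rz(theta):
    """Single-qubit rotation about Z -- the kind of gate the QFT uses heavily."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

plus = np.array([1, 1]) / np.sqrt(2)      # the |+> state

theta = np.pi / 256                        # a small, QFT-style rotation angle
for eps in (1e-2, 1e-3, 1e-4):            # over-rotation error in radians
    ideal = rz(theta) @ plus
    noisy = rz(theta + eps) @ plus
    fidelity = abs(np.vdot(ideal, noisy)) ** 2
    print(f"eps={eps:.0e}  infidelity={1 - fidelity:.2e}")
```

The infidelity goes as roughly (eps/2)^2 per gate: tiny for one gate, but it accumulates over the very long gate sequences a full Shor run would need.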


> The assumption is that the SNR of logical (error-corrected) qubits is near infinite, and that such logical qubits can be constructed from noisey physical qubits.

This is an argument I've heard before and I don't really understand it[1]. I get that you can make a logical qubit out of physical qubits and build in error correction so the logical qubit has near-perfect SNR, but surely if (say) the number of physical qubits you need for the nth logical qubit is O(n^2), then the SNR of the whole system isn't near infinite, it's really bad.

[1] Which may well be because I don't understand quantum mechanics ...


The really important thing is that logical qubit error decreases exponentially with the amount of error correction. As such, for the ~1000-qubit regime needed for factoring, the amount of error correction ends up being essentially a constant factor (~1000x physical to logical). As long as you can build enough "decent"-quality physical qubits and connect them, you can get near-perfect logical qubits.
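A toy version of that scaling (the threshold p_th and the prefactor below are illustrative values I chose, not measured ones) shows the key point: logical error falls exponentially with code distance d while the qubit overhead grows only polynomially.

```python
# Surface-code-style scaling: p_logical ~ A * (p/p_th)^((d+1)/2),
# with roughly 2*d^2 physical qubits per logical qubit.

P_PHYS = 1e-3     # physical error rate per operation (assumption)
P_TH = 1e-2       # error-correction threshold (assumption)

for d in (3, 7, 15, 25):
    p_logical = 0.1 * (P_PHYS / P_TH) ** ((d + 1) / 2)
    physical_qubits = 2 * d * d            # rough surface-code overhead
    print(f"d={d:2d}  ~{physical_qubits:4d} physical qubits  p_logical ~ {p_logical:.0e}")
```

At d = 25 that is ~1250 physical qubits per logical qubit and a logical error rate around 1e-14: a large but essentially constant overhead factor.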


Having demonstrated error correction, some incremental improvements can now be made to make it more repeatable and with better characteristics.

The hard problem then remains how to connect those qubits at scale. Using a coaxial cable for each qubit is impractical; some form of multiplexing is needed. This, in turn, causes qubits to decohere while waiting for their control signal.


This is quite fallacious and wrong. The first computers were built to immediately solve problems that were already being solved slowly by manual methods. There was never a period where people built computers so slow that they were slower than adding machines and slide rules, just because they seemed cool and might one day be much faster.


Charles Babbage started on the Difference Engine in 1819. It took a very long time after that before computers were useful.


Additionally, part of the problem was that metal working at the time wasn't really advanced enough to make the required parts to the necessary precision at a reasonable price. Which sounds really quite similar to how modern quantum computers are right at the edge of current engineering technology.


The fact that it does appear to be so difficult to scale things up would suggest that the argument isn't silly.


Actually yes, how many numbers you can crunch per second and how big they are were among the first benchmarks for actual computers. Also, these prototypes were almost always immediately useful. (Think of the computer that cracked Enigma.)

In comparison, there is no realistic path forward for scaling quantum computers. Anyone serious that is not trying to sell you QC will tell you that quantum systems become exponentially less stable the bigger they are and the longer they live. That is a fundamental physical truth. And since they're still struggling to do anything at all with a quantum computer, don't get your hopes up too much.


That would be the bombe, which didn't really crunch numbers at all, but was an electromechanical contraption to automate physically setting Enigma rotors to enumerate what combinations were possible matches.


> Anyone serious that is not trying to sell you QC will tell you that quantum systems become exponentially less stable the bigger they are and the longer they live.

If what you are saying is that error rates increase exponentially such that quantum error correction can never correct more errors than it introduces, i don't think that is a widely accepted position in the field.


People who believed that would opt out of being in the field, though. No?


> what is this new style of writing

Congratulations, you are now able to recognize an AI-generated text.

(As of December 2025 at least, who knows what they will look like next month.)


I don't mind the style but the factual errors are not good. Like "How NVIDIA..." when it was done by DeepMind with TPUs.


Since May 2024 he hasn't been actively trying anything at all.


If you're curious about Simons, and the wikipedia page is not enough, I found "The Man Who Solved the Market" by Gregory Zuckerman an interesting read.


His foundations are still doing good work.


Frankly I am so tired of this whole branch of research where people try to be foundational about "quantum theory" but at the same time boil it down to qubits, gates, Bell tests and, well, two-by-two matrices.

Here is my viewpoint, which somehow some people find controversial: quantum theory is first and foremost a description of individual particles. To describe their time evolution, we use the Schrodinger equation:

i d_t Psi = H Psi

What is that "i" there? Oh right, the imaginary unit. So... quantum theory uses complex numbers.
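The role of that i is easy to exhibit numerically. A minimal sketch (my own choices: hbar = 1, H = Pauli-X, a forward-Euler step with renormalization): the amplitudes of Psi are irreducibly complex, while the measurable probabilities |Psi|^2 come out real.

```python
import numpy as np

H = np.array([[0, 1], [1, 0]], dtype=complex)   # Hamiltonian: Pauli-X (hbar = 1)
psi = np.array([1, 0], dtype=complex)            # start in the state |0>

dt = 0.001
for _ in range(int(np.pi / 2 / dt)):             # evolve for t ~ pi/2
    psi = psi - 1j * dt * (H @ psi)              # Euler step of i d_t Psi = H Psi
    psi /= np.linalg.norm(psi)                   # keep the state normalized

probs = np.abs(psi) ** 2
print("amplitudes:   ", psi)      # complex: roughly (0, -1j)
print("probabilities:", probs)    # real, non-negative, summing to 1
```

Drop the 1j from the update and the state stops being a rotation in Hilbert space; the complex unit is doing real work here.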

Now you are free to search for another theory without the "i", and perhaps even find something that is somehow mathematically consistent. But that theory either describes experiments just as well as ordinary quantum theory, in which case it is physically equivalent and of no advantage (except to those with strong allergies to complex numbers), or it does not, and then it is wrong.

Of course the last logical possibility is that your theory might do better than quantum theory... but that is the dream only of those who do not know quantum field theory.

/rant, with apologies


There is really nothing to the appearance of complex numbers in QM. In QM we must design wave functions that do double duty: they represent the probabilities of measurement outcomes AND capture the symmetries implicit in the system, related to the fact that there are degrees of freedom between preparation of a state and measurement (for example, we may rotate our detector any way we wish before we measure a particle in a given prepared spin state). To accomplish this we need number-like objects for our wave function that square to real numbers but have enough structure to represent (in this case) the rotations.

As you venture further into the universe of QFT you find that you need even more exotic number-like objects, like spinors with their own peculiar structures, but the essence is the same: they must serve the purpose of representing probabilities and symmetries. The complex numbers in QM mean nothing at all except in that they serve these purposes.

If we wish to speak informally and wave our hands a bit we can say that it isn't so surprising that we find the complex numbers and related number like objects because the complex numbers are a promise to square something at a later date and recover a real number, which is what we need to satisfy the requirement to represent probabilities.

In fact, we can formulate classical probabilistic mechanics with complex numbers (the Koopman–von Neumann operator theory) and again, they appear because we want to operate on objects living in a nice Hilbert space which also square to probabilities. It only took me 20 years to understand this, so I can sympathize with confusion.


It's a long time since I read it, but there's a book called "The Structure and Interpretation of Quantum Mechanics" [1] by R. I. G. Hughes. The "Structure" part of it begins by building up most of the mathematical framework (including use of complex numbers, Hilbert spaces, operators, etc), motivated only by the desire to build a physical theory that is probabilistic in nature. It then shows how you can add one extra ingredient that turns the framework into that used for quantum mechanics [2]. I assume that everything discussed up to that point applies equally to Koopman-von Neumann.

It's a really nice book, very self-contained. I think anyone with a basic mathematical education (A-Level or equivalent) could get through it without having to read other things to acquire prerequisites, though they should be prepared to think quite hard.

1. The resemblance to the titles of Gerald Jay Sussman's "Structure and Interpretation" books appears to be coincidental. The title is meant literally: the book is split into two sections, one on the (mathematical) structure of QM and one on its (philosophical) interpretation. There are no similarities in style, pedagogy or subject matter to Sussman's books and no use of, or reference to, programming. The author was a professor of philosophy at the University of South Carolina.

2. He actually lists a collection of alternatives for that extra ingredient, any one of which has the same effect when added.


It's nice to see this reference. I'm currently reading it and about halfway through (making my way through the chapter on Quantum Logic).

The discussion of the EPR paradox and the Kochen-Specker Theorem was really very illuminating.


It is one of my favorites.


> complex numbers are a promise to square something at a later date and recover a real number

Except, most complex numbers don't square to a real number. Only those lying along the imaginary or real axes square to a real number; everything else just squares to another (non-real) complex number. In what way do complex numbers represent a "promise" to square it later and recover a real number? Who is making this promise? I feel like this is falling into the same trap of believing that complex numbers are not allowed to simply exist on their own merit.

I think it's quite serendipitous that the number system designed to algebraically close the reals to include roots of polynomials like x^4 + 1 happens to also cleanly describe so much of physics. There happens to be a lot of physics that boils down to "magnitude and phase" where those quantities interact in the same way complex numbers do, but it's not a-priori obvious that electromagnetism shouldn't need some third quantity as well, nor that we shouldn't be using quaternions instead, nor some other algebraic structure defined over 2D or 3D or 4D vectors.

Indeed, as you point out, there are plenty of more complicated mathematical structures that are best for describing other parts of physics, like spinors, Lie groups, and special unitary groups. It's not a-priori obvious that Lie groups should be so important to physics either. But neither should anyone protest their use as somehow not "really existing". It is true that complex numbers do not physically exist -- neither do Lie groups, and neither does the number 7. We got lucky that mathematicians had already explored an algebra that turned out to be perfect for "magnitude-and-phase" physics, but it doesn't seem like "squaring to a real number" had anything to do with why they are useful. Real numbers have no stronger claim to truly representing physics than complex numbers, spinors, or Lie groups do.


I think this is just loose terminology; instead of squaring they should have said “multiply by the complex conjugate”, which is what you do to quantum-mechanical wavefunctions to obtain real-valued probabilities.
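Concretely (a throwaway sketch): squaring a complex amplitude generally stays complex, while multiplying by the conjugate always lands on a non-negative real number, which is what probabilities need.

```python
z = 0.6 + 0.8j                 # an arbitrary amplitude with |z| = 1

print(z ** 2)                  # complex (about -0.28 + 0.96j): not a probability
print(z * z.conjugate())       # |z|^2: real up to float rounding, here ~1.0
print(abs(z) ** 2)             # the same |z|^2 as a plain float
```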


> Real numbers have no stronger claim to truly representing physics than complex numbers, spinors, or Lie groups do.

Eh, call me when your detector gives you back a complex number. Measurements return real numbers. I've never known one to return a complex valued one. Probabilities are real numbers. I feel this puts real numbers in a privileged position. If you ever wrote a theory that suggested that you lay a ruler against an object and measure a complex value, you'd be in trouble.


> Eh, call me when your detector gives you back a complex number. Measurements return real numbers.

There are an uncountably infinite number of real numbers. 100% of them (but not all) are not computable, and cannot be written down or described. Measurements do not return "true" real numbers. Measurements return whatever the detector is designed to return. Digital measurements return binary floating-point, fixed-point, or integer numbers. Some measurements return "red" vs. "blue". Pregnancy detectors return "1 line" or "2 lines". All it would take for a detector to give a complex number is to design one that measures something that can be described as a complex number, and return it as a complex number. For example, a phasor measurement unit:

https://en.wikipedia.org/wiki/Phasor_measurement_unit

Should I call you?


I think there is a credible case to be made that all we ever actually measure is relative displacements in space. We design objects (physical or mathematical) to convert these displacements into quantities or units of interest and might even decorate such with some additional structure beyond the reals, but in the end, we are measuring distances relative to a standard. This account becomes somewhat tricky when digital and/or electronic measurements are taken into account, but goes through, I believe.

When I say measurements are real I mean that displacements between objects in space are represented with real numbers.

You make a good and interesting point as to whether the actual structure of the reals, which is, as you say, pretty strange, meaningfully corresponds to relative displacements in space, but this is a separate point. If we wished to be finitist, we could argue that we measure over a sparse subset of the reals or something like that or we could define various methods of putting the rational numbers to use for this purpose. But my larger point is that in the end the physical world appears to be entirely sensible to us only as relative displacements of objects in space and these appear to map to something very like the real numbers.

In fact, physics at its most basic is encoded in these terms as well, where any system is conceptualized as being encoded by its q's, which correspond to relative displacements, and the generators of their motion, roughly speaking either the time derivatives of the q's or their conjugate momenta. But the q's are what we have to work with. At any instant in time we must lay our rulers out, one way or another, and then construct any other physical property of interest in terms of those displacements.


A "measurement" in the general sense (not quantum) is about amplifying or isolating some signal out of all the noise. I simply don't agree that all measurements boil down to relative displacements. Did a chemical reaction occur or not? Is it red or is it blue? Sure, in many of these measurements, something tangentially related to distance (like light wavelength) is involved, but lots of things are involved: measurements are inherently "macro" phenomena, which means many processes will be involved, not the least of which is electrochemical signaling in the brain of the measurer. It's just far too reductive to say that every quantity we measure is actually relative displacement. I could just as justifiably (or more so) say that every measurement is actually just measuring the strength of an electromagnetic field. Why is this helping? What does this have to do with which numbers "truly exist" or not?


I guess we should both read the SEP page on measurement.

https://plato.stanford.edu/entries/measurement-science/

Note that Bertrand Russell is on "team Nathan" in the sense that he thinks measurements relate to (at most) the reals.

I don't think anyone ever really measures the electromagnetic field. We might, for example, measure the displacement of a charged object attached to a spring in an electromagnetic field. Or, if the field is changing in time, we measure that displacement as a function of the position of the hands on a clock. But it is very hard for me to think of a situation where we measure something other than a distance at its most basic level. Even in an ADC we measure voltage relative to some calibrated voltage, which we measured using a voltmeter that shows us our answer as a deflection in a meter.

This is particularly relevant in QM because in fact all the values we might measure are the eigenvalues of hermitian operators and they are, in fact, restricted to the real numbers.


> I don't think anyone ever really measures the elecromagnetic field. We might, for example, measure the displacement of a charged object attached to a spring in an electromagnetic field.

I don't think anyone ever really measures the displacement of a charged object attached to a spring in an electromagnetic field. We might, for example, measure the difference in strength of neuronal activation in our brain from neurotransmitters emitted in a chain traveling from the photoreceptor cells in our fovea, which are responding to differing quantities of photoisomers that have had their shape altered by the absorption of different frequencies of photons reflected differently off the needle of a gauge in the display of an instrument which is measuring the displacement of a charged object attached to a spring in an electromagnetic field.

This is what I mean when I say it's overly reductive to say a measurement is necessarily a displacement. A measurement is lots of things, and not all of them can be represented as a spatial displacement unless you really shoehorn it.

> all the values we might measure are the eigenvalues of hermitian operators and they are, in fact, restricted to the real numbers.

Is that even true? Spin is a measurable quantity, and you cannot possibly get a spin of 0.2. Most measurable quantum numbers are essentially integers (integer multiples of some conventional base). Remember this discussion is about whether complex numbers are "lower-class citizens" than real numbers in physics. If you're going for measurable quantum numbers, these are almost all counterexamples to the idea that real numbers are special, and instead hint that it's simply integers that are the only first-class citizens in physics. I also fundamentally disagree that the possible eigenvalues of hermitian operators are the sole criterion we should be using to "rank" the truthiness or realness of mathematical structures.

I deeply do not care what philosophy has to say about this, either. Philosophy as a discipline is completely incapable of determining the truth, because it is unwilling to ever reject a single idea; all it is is a giant collection of shower thoughts. Every philosophy page, including the one you linked, somewhere includes "According to Aristotle ...". If you're trying to learn evolutionary biology and you read "according to Lamarck ...", then you are not learning science, you're reading science history. Yet this is all philosophy ever can say about anything. Science curates ideas; philosophy hoards them.


The "i" is there because it is a convenient way in our system of mathematics to write out such an equation, but that really comes from the fact that complex numbers are two-dimensional. Our best understanding of the universe demands that higher dimensionality, not necessarily the imaginary-ness.

Yes, a different mathematical formulation may be rewritten into this imaginary form, and thus is mathematically equivalent. But by the same logic a heliocentric system of elliptical orbits is mathematically equivalent to a geocentric system of epicycles. From one perspective there is a certain deeper meaning there - the universe has no absolute reference frame; but if you view your cosmos in terms of epicycles it's very difficult to develop an understanding of what drives those epicycles, namely gravity. Likewise, thinking about quantum mechanics in terms of imaginary numbers may allow for accurate calculations, yet nevertheless be an intellectual stumbling block for understanding why the universe is this way.

I personally have no issue with "imaginary" numbers having real physical meaning. Our inability to process the square root of negative 1 seems more like a limitation of our ape brains than of the universe, and likewise for the majority of quantum weirdness. But by throwing up my hands and saying the question cannot be answered, I would guarantee that I will never find the answer even if it does indeed exist.


The phase space formulation of QM relies less on complex numbers than the Schrodinger one: it models states using quasi-probability distributions, where the "probabilities" behave in all the usual ways except they can go negative. Interestingly, the classical limit of this (that is, when ħ goes to zero) still has negative probabilities in it.
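A minimal numerical illustration of that negativity, using the standard identity that the Wigner function at the phase-space origin is the parity expectation over pi (the truncation dimension below is arbitrary, chosen for illustration):

```python
import numpy as np

# W(0,0) = (1/pi) * Tr(P rho), where P is the photon-number parity operator.
# For the first excited oscillator state |1>, this comes out negative.
dim = 10                                  # illustrative Fock-space truncation
rho = np.zeros((dim, dim))
rho[1, 1] = 1.0                           # density matrix of the pure state |1>

parity = np.diag([(-1.0) ** n for n in range(dim)])
w_origin = np.trace(parity @ rho) / np.pi
print(w_origin)                           # -1/pi: a negative "probability"
```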


The issue with epicycles is you need an infinite number of them to produce the actual orbits and with an infinite number of epicycles you can describe any shape. Thus it is as complex as the underlying data.

Quantum Mechanics on the other hand is incredibly constrained and therefore actually says something.


And pure ellipses as predicted by Newtonian gravitation also don't line up with actual orbits perfectly. In both cases they are just models approximating reality, one of which happens to be more elegant. I don't know how anyone would be able to jump straight from epicycles to general relativity.

Quantum mechanics likewise is just an approximation of quantum field theories.


It’s not about elegance for the sake of it. The number of constants in a theory provides a meaningful point of comparison, especially if you need to increase them after an experiment.


Epicycles weren't a theory, they were a model. They did not try to explain why the planets moved in the sky as they did, they only predicted where they'd be. Neither, for that matter, were Copernican or Keplerian mechanics theories. They too required unending tweaking because they also were only approximations of what was actually happening. For the first few decades after heliocentrism was proposed, it gave worse results, and demanded more tweaking. What really won people over was that the phases of Venus were accurately predicted by the model as well. The only way to achieve that result with epicycles was to rearrange everything to be mathematically equivalent to a heliocentric model.

You can reconstruct our modern understanding of the motion of the planets in the reference frame of a static earth and produce a mathematically equivalent path that draws out epicycles which predict the positions of planets with exactly the same accuracy as our regular formulations. You can rework the representation of the laws of gravity such that they spit out positions in this reference frame. It is an equally valid model of the cosmos, with exactly the same number of starting assumptions, it's just remarkably more complex.
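A toy numpy sketch of that equivalence (circular, coplanar orbits with rough orbital values, not real ephemerides): seen from a static Earth, ordinary heliocentric motion already traces out retrograde, epicycle-like loops.

```python
import numpy as np

# Heliocentric positions as complex numbers; 2*pi of t = one Earth year.
t = np.linspace(0, 4 * 2 * np.pi, 2000)       # about four Earth years
earth = 1.00 * np.exp(1j * t)                 # radius 1 AU, period 1 yr
mars = 1.52 * np.exp(1j * t / 1.88)           # radius 1.52 AU, period 1.88 yr

geo = mars - earth                            # Mars in the Earth-centered frame

# During the epicycle-like loops, Mars's apparent angular motion reverses:
angle = np.unwrap(np.angle(geo))
retrograde = bool(np.any(np.diff(angle) < 0))
print(retrograde)                             # True: apparent backward motion
```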


It started from an actual theory based around the assumption that spherical motion was perfect. Two cycles did actually work for a while; eventually the most accurate model needed ~17, with people giving up on the underlying theory as the number of terms destroyed the initial idea.

Today with vastly more data and more accurate measurements you’d need effectively infinite terms, which makes it more obvious but you don’t need that level of absurdity to render judgment.


No it didn't. Epicycles were from the get-go nothing but an attempt to fit a mathematical function to observed data to predict future positions of planets. It's a geometric method of curve fitting, which is a weaker form of the Fourier series, and the system was developed by Greek mathematicians trying to improve upon Babylonian computations that didn't even have a geometric model. There is a reason that the moon, the only thing in the cosmos that does in fact orbit the earth, has the most complicated series of epicycles to describe its motion.

Ptolemy rejects Aristotle's cosmology which relied on perfect spherical motion. Ptolemy really did believe that the planets moved according to his model (ie it wasn't just a pure computational tool) but he was very clear that his model was based purely on mathematics. Not only did he not give a reason for why the cosmos should take this form, he openly speculates that the answer is unknowable, and works under the assumption "maybe they can move wherever they want and they just like moving this way."

Further, cycles were not added over time [1]. On day one there were 31 cycles and circles, and these were exactly the same ones being used at the time of Copernicus. You also don't need many epicycles to accurately produce a path identical to Keplerian orbits. Completely arbitrary orbits can be described with finite epicycles. [2] Indeed the problem was that Ptolemy didn't fit the data by adding more epicycles, but instead through the equant, which moved the positions of the centers of the epicycles, meaning adding more epicycles would not make the model more accurate. The story of ever more epicycles being added to a bloated old theory that was streamlined by heliocentrism is a modern myth.

[1] https://diagonalargument.com/2025/05/20/from-kepler-to-ptole...

[2] https://web.math.princeton.edu/~eprywes/F22FRS/hanson_epicyc...
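The finite-epicycle claim is easy to check: each complex Fourier coefficient is one uniformly rotating circle, and an ellipse traced at a uniform angular rate needs exactly two. A numpy sketch with arbitrary illustrative semi-axes:

```python
import numpy as np

# Sample an ellipse as a complex-valued path (semi-axes 1.0 and 0.6).
N = 256
t = np.linspace(0, 2 * np.pi, N, endpoint=False)
ellipse = np.cos(t) + 0.6j * np.sin(t)

# Each DFT coefficient is one epicycle: a circle of fixed radius and rate.
coeffs = np.fft.fft(ellipse) / N

# Keep only frequencies +1 and -1; algebraically this ellipse is
# 0.8*exp(it) + 0.2*exp(-it), so two epicycles reproduce it exactly.
kept = np.zeros_like(coeffs)
kept[1], kept[N - 1] = coeffs[1], coeffs[N - 1]
reconstructed = np.fft.ifft(kept) * N

print(np.max(np.abs(reconstructed - ellipse)))  # effectively zero
```

A real Kepler orbit is traversed at non-uniform speed, so it needs more terms, but the coefficients fall off quickly, which is why a handful of epicycles already matched naked-eye observations.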


> 31 cycles and circles

That’s a count of the total needed to describe the motion of multiple celestial bodies.

I’m referring to the number of cycles needed to describe the motion of a single celestial body. There wasn’t enough data at high enough precision to need 17 cycles for a single celestial body until much later. At the time lower precision was more common, but someone really did go to such an extreme to create the best fit.

> Completely arbitrary orbits can be described with finite epicycles.

The number of points isn’t fixed with continuous observations. Your best fit for past data keeps needing new cycles over time unless you’re working backwards from a much better model. Even then you run into issues with earthquakes changing the length of the day, etc. The basic assumptions they were working from don’t actually hold up.

Also, I’m reasonably sure you couldn’t actually write out an infinite decimal representation of the irrational number e using a finite number of epicycles. Not something I’ve really considered deeply, but it seems like an obvious counterexample.


Please read the sources I cited. You are arguing about epicycles based on a fictional story you heard about them.


I did read them.

The first is overlooking the issue of overfitting using hand calculation and imperfect observations. The calculated “best fit” for the data available did involve adding a bunch of epicycles, and there was no theoretical reason to avoid doing so.

The second is playing fast and loose with a fat line drawn over a squiggly line based on a better model. It’s being mathematically rigorous but intentionally deceptive. You can fairly trivially construct a set of epicycles to fit some desired shape, but working backwards from observation there’s nothing guiding you to the most elegant possible solution for a given situation.


More complete astronomy data from telescopes showed that epicycles needed to be even more complicated than they were.

If we manage to find better tools for QM where we don't need to perform as much post-selection of experimental data, perhaps we'll also find a simpler model.


Yes, the post is focusing on the overall effect of operations (unitaries) rather than their continuous trajectories (hamiltonians acting on system via Schrodinger equation) (analogous to working with impulses rather than forces).

To make the continuous case interesting as a compilation problem, you'd need some alternate formulation of the Schrodinger equation, e.g. based on the limit of small powers of unitaries rather than on the matrix exponential, so that deleting i didn't delete literally all processes. Or you could arbitrarily declare real-only hamiltonians are permitted, despite the Schrodinger equation saying "i". But that'd be kinda lame, imo.
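To illustrate the point about deleting the i (with an arbitrary toy Hamiltonian, not anything from the post): for a real Hermitian H, the Schrodinger evolution U = exp(-iHt) is unitary, while the i-less exp(Ht) doesn't preserve norms at all, so it isn't a quantum process of any kind.

```python
import numpy as np

# Arbitrary real symmetric (hence Hermitian) Hamiltonian and time.
H = np.array([[0.0, 1.0],
              [1.0, 0.5]])
t = 1.3

# Matrix exponentials via eigendecomposition (H = V diag(evals) V^T).
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.T   # exp(-iHt)
M = evecs @ np.diag(np.exp(evals * t)) @ evecs.T         # exp(Ht), i deleted

print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: unitary
print(np.allclose(M.T @ M, np.eye(2)))         # False: norms blow up or decay
```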

(Note: am author of post)


Gidney, that's you?

Huge fan of your work!

I just started my PhD in distributed quantum computing, and my Masters was applying that framework to the QFT.

I came across a number of papers you authored in the process, as well as your blog. In particular, big fan of Kahanamoku-Meyer et al.'s optimistic QFT circuit.

Anyway, keep up the great work!


I made a related comment about two years ago:

https://news.ycombinator.com/item?id=38255476


> Paris' Mordor: Châtelet.

"Worst of all, the air was full of fumes; breathing was painful and difficult, and a dizziness came on them, so that they staggered and often fell. And yet their wills did not yield, and they struggled on."

