> the last project I encountered in a professional capacity in Python where optional type hinting was used but wasn't always accurate, which was a special sort of hell.
But that's the entire purpose of optional type hinting. If the hints had to be accurate, you'd have mandatory typing, not optional hinting.
It is both of those if you use a typechecker, which is the whole reason the hints exist. (In fact, the first popular typechecker predates the annotation syntax and used type comments; type annotations were developed specifically so that type checking could be accommodated in the language rather than becoming its own separate language.)
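Concretely (a made-up function, and assuming mypy or a similar checker): the interpreter runs a mislabeled function without complaint, while the typechecker flags it before the code ever runs.

def parse_port(raw: str) -> int:
    # Annotated as returning int, but actually returns str.
    # CPython never checks this; mypy reports something like:
    #   Incompatible return value type (got "str", expected "int")
    return raw.strip()

port = parse_port("8080")
print(port + 1)  # TypeError at runtime, far away from the actual bug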
That's the problem! The code should not run if the types are wrong. Having an external tool is an antipattern.
Having to rely on process for validity is a recipe for trouble. We already know how the greater Python community has been with requirements.txt and dependencies. I've spent days fixing this garbage.
It's a tooling problem. Good tools make good habits part of the automation and stop you from having to think about it.
You are talking about misleading type hints, not optional ones. Optional means they don't have to be added. Wrong type hints are so much worse than missing ones.
I think the purpose of optional type hinting is that you don't have to add it everywhere all at once, not that it doesn't have to be accurate. I guess you could split hairs and say "hint" doesn't imply perfect accuracy, but... adding a language feature that can lie really seems to have a lot of downsides versus upsides, whereas optionality at least has obvious migration benefits.
You could have optional type hints where the runtime would still yell at you - maybe even from just an optional flag - if you returned a string out of a function that should return an int.
Because as-is, once you have a function that says it returns an int but returns a string instead, in a big codebase your editor tooling gets really confused, and it's way worse to work through than if the hints weren't there at all.
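For what it's worth, you can approximate the "runtime yells at you" idea in userland today. A minimal sketch (the decorator name is mine, and isinstance() only handles plain classes, not generics like list[int]; libraries such as typeguard and beartype do this properly):

import functools
import typing

def check_return(func):
    # Sketch: validate the return value against the annotation at call time.
    expected = typing.get_type_hints(func).get("return")
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        if isinstance(expected, type) and not isinstance(result, expected):
            raise TypeError(
                f"{func.__name__} is annotated -> {expected.__name__} "
                f"but returned {type(result).__name__}"
            )
        return result
    return wrapper

@check_return
def get_count() -> int:
    return "42"  # lying type hint

get_count()  # TypeError: get_count is annotated -> int but returned str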
(And there are tools in Python that you can use to inspect and verify the accuracy. But those tools are also... optional... And if you start to apply them to a codebase where they weren't used, it can be very time-consuming to fix everything...)
> And there are tools in Python that you can use to inspect and verify the accuracy. But those tools are also... optional... And if you start to apply them to a codebase where they weren't used, it can be very time-consuming to fix everything...
How is that "bad" solution different from this "good" one?
> You could have optional type hints where the runtime would still yell at you - maybe even from just an optional flag - if you returned a string out of a function that should return an int.
If it's built in to the runtime you get a lot of potential benefits:
- you don't need to install additional packages
- you could have a dev mode with warnings on by default at execution and a prod mode where they're ignored (because you don't want to hurt prod perf by checking all the time; see the sketch after this list)
- you can then have people in the dev environment catching things as they write/run/test their code, versus only whenever they run the third-party tool (which it seems a lot of people don't even set up to run on every commit)
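Python already ships a crude version of that dev/prod switch: the built-in `__debug__` constant is True in a normal run and False under `python -O`, which also strips assert statements. A tiny sketch of a checker keying off it:

def expensive_check(value) -> None:
    # `python -O` removes this assert entirely, so prod pays no cost.
    assert isinstance(value, int), f"expected int, got {type(value).__name__}"

if __debug__:
    print("dev mode: checks active")    # python script.py
else:
    print("prod mode: checks skipped")  # python -O script.py

expensive_check(42)       # fine in both modes
expensive_check("oops")   # AssertionError in dev, silently ignored under -O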
Let's flip the question around! What do you think are the benefits of making it easy to add misleading, incorrect type hints?
Going by wikipedia, the incubation period can be up to three months. That isn't a particularly significant span of time if we're measuring how likely someone is to suffer an unexpected death. It's long enough that the possibility exists, but that's about all you can say.
But the question isn't "what are the odds someone who dies in this period has rabies", it's "what are the odds someone who died after being infected with rabies died before they started showing symptoms", so the rarity of people incubating rabies is irrelevant.
Further, rabies incubation is highly variable - symptoms may not appear for years.
> while Gilbert said he enjoyed Vampire Survivors, he added that the game’s style was “a little too much ‘ADHD’ for me. I look at those games and it’s like, wow, I feel like I’m playing a slot machine at some level. The flashing and upgrades and this and that… it’s a little too much.”
Vampire Survivors was designed by a guy whose job was coding slot machines.
It's weird. The "records" in question appear to be those kept by the Smithsonian:
> The Smithsonian Institution’s Global Volcanism Program has no record of any eruptions of Hayli Gubbi during the Holocene, the current geological epoch, which began at the end of the last ice age, about 11,700 years ago.
But I'm fairly confident that the Smithsonian's records don't go back more than 700 years.
unsigned add(unsigned x, unsigned y) {
    unsigned a, b;
    do {
        a = x & y;  /* every position where addition will generate a carry */
        b = x ^ y;  /* the addition, with no carries */
        x = a << 1; /* the carries */
        y = b;
        /* if there were any carries, repeat the loop */
    } while (a);
    return b;
}
It's easy to show that this algorithm is correct in the sense that, when b is returned, it must be equal to the original x+y: the sum x+y is a loop invariant (each iteration replaces it with 2(x&y) + (x^y), which equals x+y), and at termination x is 0 and y is b.
It's a little more difficult to see that the loop will necessarily terminate.
New a values come from a bitwise & of x and y. New x values come from a left shift of a. This means that, if x ends in some number of zeroes, the next value of a will also end in at least that many zeroes, and the next value of x will end in at least one additional zero (because of the left shift). Eventually a will end in as many zeroes as there are bits in it, i.e. a will be 0, and the loop will terminate.
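A quick way to convince yourself of both properties empirically: a Python transliteration of the same loop (my sketch, masking to 32-bit words since Python ints are unbounded) that asserts the invariant on every iteration:

MASK = 0xFFFFFFFF  # emulate 32-bit unsigned arithmetic

def add(x, y):
    total = (x + y) & MASK           # the value the invariant preserves
    while True:
        a = x & y                    # carry positions
        b = (x ^ y) & MASK           # sum without carries
        x = (a << 1) & MASK          # the carries, shifted into place
        y = b
        assert (x + y) & MASK == total, "loop invariant violated"
        if not a:                    # no carries left: b is the answer
            return b

assert add(1234, 5678) == 6912
assert add(0xFFFFFFFF, 1) == 0      # wraps around, like unsigned overflow in C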
Alive2 does not handle loops; I don't know exactly what it does by default, but changing the `shl i32 %and, 1` to `shl i32 %and, 2` still has it report the transformation as valid. You can add `--src-unroll=2` to make it check up to two loop iterations, which does catch such an error (and does still report the original as valid), but of course that's quite limited. (Maybe the default is like `--src-unroll=1`?)
Alternately: Self-made wealth is so frequently derived from evil that those who are born rich and come good owe penance for the sins of their forebears.
Yeah, I'm not convinced Microsoft can do software anymore. I think they're a shambling mess of a zombie software company with enough market inertia to keep going for a long time.
They did engineer a good browser: the original Edge with the Chakra JavaScript engine. It was faster than Google Chrome and had some unique features, including a world-best, butter-smooth, customizable epub reader. I loved it for reading; it beat commercial epub readers. Then Nadella took over and announced that Microsoft was getting rid of it, that Edge would move to Chromium, and that Windows Phone was also being killed; modern Microsoft would be cloud/AI and ads. That was so depressing.
You could say the same thing about all Microsoft products then. How many full-time developers does it take to support Windows 11 when Linux is available, SQL Server when Postgres is available, Office when LibreOffice exists?
And so on, all under licenses that allow Microsoft to do whatever it wants with them?
They should be embarrassed into doing better, not spin it into a "wise business move", aka transferring that money into executive bonuses.
Microsoft gets a lot of its revenue from the sale of licenses and subscriptions for Windows and Office. An unreliable source that gives fast answers to questions tells me that the segments responsible for those two products have revenues of about $13 billion and about $20 billion per quarter, respectively.
In contrast, basically no one derives any significant revenue from the sale of licenses or subscriptions for web browsers. As long as Microsoft can modify Chromium to have Microsoft's branding, to nag the user into using Microsoft Copilot and to direct search queries to Bing instead of Google Search, why should Microsoft care about web browsers?
It gets worse. Any browser Microsoft offers needs to work well on almost any web site. These web sites (of which there are hundreds of thousands) are in turn maintained by developers (hi, web devs!) who tend to be eager to embrace any new technology Google puts into Chrome, with the result that Microsoft must respond by putting the same technological capabilities into its own web browser. Note that the same does not hold for Windows: there is no competitor offering an alternative to Windows that is constantly inducing the maintainers of Windows applications to embrace new technologies, requiring Microsoft to incur the expense of engineering effort on Windows to keep up. This suggests to me that maintaining Windows is actually significantly cheaper than maintaining an independent mainstream browser would be. An independent mainstream browser is probably the most expensive category of software to create and maintain, excepting only foundational AI models.
"Independent" here means "not a fork of Chromium or Firefox". "Mainstream" means "capable of correctly rendering the vast majority of web sites a typical person might want to visit".
They did incur that cost… for decades. They were in a position where their customers were literally forced to use their product and they still couldn’t create something people wanted to use.
I thought rabies had to be considered for any interaction with a wild animal. For one showing atypical behavior, the indication is that much stronger.