Crowing over omitting Arabic numerals in the name of avoiding any kind of cultural influence or bias seems silly when everything is still going to be expressed in decimal.
And in hours, minutes, and seconds. Yes, those are nearly universal (I guess there may still be a few tribes in the Amazon or Indonesia that don't use them), but they are still cultural.
Also, having every hour be the same duration is cultural. The Romans, for example, at one point used a clock where "the period of the natural day from sunrise to sunset was divided into twelve hours" (https://en.wikipedia.org/wiki/Roman_timekeeping). That means hours were shorter in winter than in summer.
The author wanted it to retain some practical value, hence the discussion of the four “layers” of time—departing from the 12-hour system completely, even if there is a better way to represent time outside of it, would make the clock difficult to use.
<quote>The answer turned out to be geometry.</quote>
The answer turned out to be creating a cool piece of digital art that's essentially unusable as a timepiece. Don't get me wrong, it's neat as an art project; it's just not practical for telling time.
C++ module implementation is a story with a lot of drama, if you ever want to read up on it.
The short summary, though, is that no toolchain yet has a bulletproof implementation, though everybody at least has enough to let people kick the tires a bit.
FWIW, I still think the Google Style Guide banning UDLs entirely is too harsh. I think they should be used sparingly, but there are cases where they make sense.
In Chromium's UI code, we have a lot of APIs that deal with coordinates, which may be in either px or dp. And we have a lot of code that needs to hardcode various constants, e.g. layout values for different dialogs. IMO, it's sane to have UDL support here, e.g. `20_px` (at least if we had separate types to represent these two things, which we don't... don't get me started).
Agree. I love UDLs and use them in my code for improved legibility. I hadn't realized how badly you can screw up if you don't define all the operators correctly.
At least in Chromium that wouldn't help us, because we disable strict aliasing, and we have to: there are at least a few core places where we violate it, and porting to an alternative looks challenging. Some of our core string-handling APIs, for example, presume that wchar_t* and char16_t* are actually interconvertible on Windows; fixing that would mean memcpying, which rules out certain API shapes and adds a perf cost to the rest.
When I led C++ style/modernization for Chromium, I made this argument frequently: we should prefer the stdlib version of something unless we have reason not to, because incoming engineers will know it, you can find advice on the internet about it, clang-tidy passes will be written for it, and it will receive optimizations and maintenance your team doesn't have to pay for.
There are cases, however, when the migration costs are significant enough that even those benefits aren't really enough. Migrating our date/time stuff to <chrono> seemed like one of those.
It's weird to me, as the former lead maintainer of this page for ten years or so, that this got submitted to both r/c++ and HN on the same day. Like... what's so exciting about it? Was there something on the page that caught someone's eye?
A cursory Chromium code search does not find anything outside third_party/ forcing either signed or unsigned char.
I suspect if I dug into the archives, I'd find a discussion on cxx@ with some comments about how doing this would result in some esoteric risk. If I were still on the Chrome team I'd go looking and see if it made sense to reraise the issue now; I know we had at least one stable-branch security bug caused by this.