Hacker News | new | past | comments | ask | show | jobs | submit | pkasting's comments

Crowing over omitting Arabic numerals in the name of avoiding any kind of cultural influence or bias seems silly when everything is still going to be expressed in decimal.

And in hours, minutes and seconds. Yes, those are near universal (I guess there may still be a few tribes in the Amazon or Indonesia that don’t use them), but still are cultural.

Also, having every hour be the same duration is cultural. The Romans, for example, at one time used a clock where "the period of the natural day from sunrise to sunset was divided into twelve hours" (https://en.wikipedia.org/wiki/Roman_timekeeping). That means hours were shorter in winter than in summer.


I was also confused why the outer ring must only show zero or one, and then it struck me — twelve hour system.

The author wanted it to retain some practical value, hence the discussion of the four “layers” of time—departing from the 12-hour system completely, even if there is a better way to represent time outside of it, would make the clock difficult to use.

I struggle to see how this clock would be less practical to use with a 24-hour system.

Or if we want to keep the numbers small to make them easier to read in this clock's numerals, why not a 6-hour system?


<quote>The answer turned out to be geometry.</quote> The answer turned out to be creating a cool piece of digital art that's essentially unusable as a timepiece. Don't get me wrong, it's a cool art project, but it's not a practical timepiece.

If you want to see another cool art-piece clock, there's the one in the Berlin Europa Center: https://www.youtube.com/watch?v=jTUiWYLXD9g.


C++ module implementation is a story with a lot of drama, if you ever want to read up on it.

The short summary, though, is that no toolchain yet has a bulletproof implementation, though everybody at least has enough to let people kick the tires a bit.


Yeah, maintainers would certainly +1 a CL that added a note about the parts of //base to use instead. Trivial oversight.


FWIW, I still think the Google Style Guide banning UDLs entirely is too harsh. I think they should be used sparingly, but there are cases where they make sense.

In Chromium's UI code, we have a lot of APIs that deal with coordinates, which may be in either px or dp. And we have a lot of code that needs to hardcode various constants, e.g. layout values for different dialogs. IMO, it's sane to have UDL support here, e.g. `20_px` (at least if we had separate types to represent these two things, which we don't... don't get me started).


Agree. I love UDLs and use them in my code for improved legibility. I hadn't realized how badly you can screw up if you don't define all the operators correctly.


Yes, reading the actual data would still be UB. Hopefully it will be fixed in C++29: https://github.com/cplusplus/papers/issues/592


At least in Chromium that wouldn't help us, because we disable strict aliasing (and have to, as there are at least a few core places where we violate it, and porting to an alternative looks challenging; for example, some of our core string-handling APIs, which presume that wchar_t* and char16_t* are actually interconvertible on Windows, would have to begin memcpying, which rules out certain API shapes and adds a perf cost to the rest).


When I led C++ style/modernization for Chromium, I made this argument frequently: we should prefer the stdlib version of something unless we have reason not to, because incoming engineers will know it, you can find advice on the internet about it, clang-tidy passes will be written for it, and it will receive optimizations and maintenance your team doesn't have to pay for.

There are cases, however, when the migration costs are significant enough that even those benefits aren't really enough. Migrating our date/time stuff to <chrono> seemed like one of those.


It's weird to me, as the former lead maintainer of this page for ten years or so, that this got submitted to both r/c++ and HN on the same day. Like... what's so exciting about it? Was there something on the page that caught someone's eye?


The majority of things Chromium bans would still get banned in green-field use.

Some notable exceptions: we'd have allowed std::shared_ptr<T> and <chrono>. We might also have allowed <thread> and friends.


This bit us in Chromium. We at least discussed forcing the compiler to use unsigned char on all platforms; I don't recall if that actually happened.


I recall that google3 switched to -funsigned-char for x86-64 a long time ago.


A cursory Chromium code search does not find anything outside third_party/ forcing either signed or unsigned char.

I suspect if I dug into the archives, I'd find a discussion on cxx@ with some comments about how doing this would result in some esoteric risk. If I were still on the Chrome team, I'd go looking and see whether it made sense to reraise the issue now; I know we had at least one stable-branch security bug this caused.

