Wow, even if it wasn't so fast, I'd be tempted to use this solely for its support of intersection (A & B) types! This is a sore omission from the standard Python typing system.
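For context: the standard `typing` module has no `Intersection` type, so the usual workaround is to define a combined `Protocol`. A minimal sketch (all class names here are made up for illustration):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class Readable(Protocol):
    def read(self) -> str: ...


@runtime_checkable
class Writable(Protocol):
    def write(self, data: str) -> None: ...


# There is no Intersection[Readable, Writable] in typing;
# the workaround is a third Protocol inheriting from both.
@runtime_checkable
class ReadWritable(Readable, Writable, Protocol): ...


class File:
    """Structurally satisfies both protocols, so it matches the combo."""
    def read(self) -> str:
        return "data"

    def write(self, data: str) -> None:
        pass


print(isinstance(File(), ReadWritable))  # True: has both read() and write()
```

The annoyance is that you must pre-declare every combination you need, whereas a real `A & B` would be composable at the use site.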
Gnome works OK with integer scaling; go more granular than that and you're up shit creek.
E.g., can you set one screen to 150% and another to 175%? (I think the answer is 'technically yes, but then everything goes a bit blurry, because they do it by rendering at 2x and then downscaling.')
Proper mixed dpi scaling means stuff will render pixel-perfectly instead of downscaling hacks.
Gnome is Wayland-based, but they are very stubborn about which extensions they will or won't implement. For example, mixed-DPI scaling, server-side decorations, and accessibility protocols all work differently on Gnome, or not at all.
This makes it very difficult for Wayland to evolve in a way that people want, as Gnome is the biggest player by user count.
Interested in tangent replies to this, even if such fonts are artistically unsuitable for games: are open-source fonts OK for Japanese? I understand that Unicode assigns single code points shared by Chinese and Japanese (and Korean outside of Hangul?) even though nations write these 'single' characters differently, so a given Unicode font will look like a Chinese font or a Japanese font, but not both.
How do the big Unicode OSS fonts like Noto and DejaVu deal with this?
IMO: quality-wise, FOSS fonts are fine (n=1) - it doesn't have to be a proprietary font, but it does have to be the right one for CJK.
- In the case of the Noto project, they gave up long ago on a literal singular font - in both name and file - covering every language; they have a bunch of variants as different fonts (TC, SC, HK, JP, KR, ...), available both as individual files and as combined files[1].
- In the case of most Latin fonts like DejaVu - I think this is where the bulk of the "wrong font" problem comes from. They don't support CJK at all, and operating systems often handle fallback to system defaults for missing characters, but sometimes the OS falls back to the wrong CJK font, causing text mIshmashlnG[2] problems, or shows a bunch of square placeholders (aka tofu). The latter is usually solved by installing Noto fonts, and the former by specifying which fonts to use as fallbacks (however that's done - it can be OS- or game-engine-dependent).
- There are also a bunch of domestically made fonts such as IPA Gothic; these usually support only their target language out of CJK, plus ASCII (+ISO-8859-1). Most Japanese-English _bi_lingual content takes this route, which has led some to wonder why Japanese content creators appear to have a unique and specific taste in fonts[3] - in fact they're just reusing the font files already in the project folder.
- For creative purposes such as games, FOSS fonts sometimes don't cut it, but that is not unique to the Japanese language.
> I understand that Unicode denotes single characters for both Chinese and Japanese (and Korean outside of Hangul?) even though there are differences between how nations write these 'single' characters, so the result is a Unicode font will look like a Chinese font, or a Japanese font, but not both.
This is "Han unification", a terrible idea from early in the development of Unicode. The idea was so bad that the affected glyphs were later also given always-Chinese and always-Japanese code points, making it possible, for example, to compare a Chinese character to a Japanese character in the same document.
But the fix exists. You can specify that you have no idea what you're trying to write by coding it as U+76F4 (直). Or you can specify a Chinese character by coding U+FAA8 (直). Or you can specify a Japanese one by coding U+2F940 (直). There isn't actually a reason you'd want U+76F4 - it's just a dead, useless code point - but we can observe here that my default font doesn't include a glyph for either U+FAA8 or U+2F940, even though U+FAA8 is by definition identical to U+76F4 (since this is a Chinese font).
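You can see how fragile these compatibility code points are in a few lines of Python. Per the Unicode standard, the compatibility ideographs canonically decompose back to the unified code point, so any normalization pass (which many pipelines apply silently) erases the distinction:

```python
import unicodedata

unified = "\u76F4"      # U+76F4, the unified ("don't know which") point
compat = "\uFAA8"       # U+FAA8, CJK Compatibility Ideograph
supp = "\U0002F940"     # U+2F940, CJK Compatibility Ideographs Supplement

for ch in (unified, compat, supp):
    print(f"U+{ord(ch):05X}  {unicodedata.name(ch)}")

# Canonical normalization folds both compatibility points
# back into the unified one:
print(unicodedata.normalize("NFC", compat) == unified)  # True
print(unicodedata.normalize("NFC", supp) == unified)    # True
```

So even where a font does cover U+FAA8 or U+2F940, any tool that normalizes text on the way through will quietly turn them back into U+76F4.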
Every single time I reload the page, the other two characters are broken rectangles, so experimentally it's 100% of the time that I'll need that character. What I want is not to see a broken rectangle. One can't say that U+76F4 is useless when it is clearly serving a use for my computer.
OP means it's the only one that renders at all. It could be the wrong glyph, but the other two are just missing-glyph boxes, which are definitely wrong. Those code points appear unusable as a practical matter.
Jesus Christ, I can't even get my own package to reliably self-publish in CI without ending up with a fragile pile of twigs; I'm awed they are able to automate infection like that.
How's the latency? Latency is what makes Zoom et al painful for me now - it ruins the ability to politely interject, give confirmation, etc. Does Apple do a better job of this than Google/Zoom? In theory you could get 20-30ms (just spitballing numbers I used to get playing shooters!) but I've never got anywhere near that with video conferencing.
Even so, latency-in-zoom kind of becomes an attribute of the medium and you learn to adapt. How does it feel with the Vision Pro though? The article talks about a really convincing sense of being in the same place with someone - how does latency affect that? (And does it differ based on if you're all physically in Silicon Valley or not?)
I would assume any added latency is negligible -- the sensors + interpretation + rendering should be very fast.
But you've still got all the network latency including Wi-Fi latency on both ends. And you always need a small audio buffer so discrete network packets can be assembled into continuous audio without gaps.
So I wouldn't expect this latency to be any different from regular videoconferencing.
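The packet-assembly buffer mentioned above can be sketched as a minimal jitter buffer. This is a hypothetical, simplified model (real implementations also handle timestamps, clock drift, and adaptive depth); the point is that the buffering depth itself is a fixed latency cost:

```python
from typing import Optional


class JitterBuffer:
    """Toy jitter buffer: hold up to `depth` packets before playback
    so that late or reordered packets can still slot in."""

    def __init__(self, depth: int = 3):
        self.depth = depth
        self.packets: dict[int, bytes] = {}  # sequence number -> payload
        self.next_seq = 0

    def push(self, seq: int, payload: bytes) -> None:
        # Packets may arrive out of order; index them by sequence number.
        self.packets[seq] = payload

    def pop(self) -> Optional[bytes]:
        # Wait until the buffer is primed (or the next packet is here).
        # This waiting is exactly the extra latency the buffer adds.
        if len(self.packets) < self.depth and self.next_seq not in self.packets:
            return None
        # If the packet was lost, emit silence instead of a gap.
        payload = self.packets.pop(self.next_seq, b"\x00" * 160)
        self.next_seq += 1
        return payload


jb = JitterBuffer(depth=2)
jb.push(1, b"b")  # arrives out of order
jb.push(0, b"a")
print(jb.pop())  # b'a' - played back in order
print(jb.pop())  # b'b'
```

With e.g. 20 ms audio frames and a depth of 3, that's 60 ms of latency before the network even enters the picture.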
The laws of physics means that the longer the path for your network packet, the higher the latency.
One way latency on the Internet across fiber is about 4μs to 5μs per kilometer in my experience.
For example, SF to Paris is ~40ms one way (it used to be 60ms 15 years ago; latency and jitter have really improved).
Double those values for the round trip allowing you to interject in a conversation.
Add Wi-Fi, which has terrible latency with a lot of jitter (1ms to 400ms of jitter is not uncommon). Wi-Fi 7 should reduce the jitter and latency in theory; we shall see improvements in the coming decade. 5G did improve latency for me, so I don't doubt Wi-Fi will eventually deliver.
In other words, you need to be within 3Mm (3000km) to have a chance at a 30ms round trip. And that's assuming peer-to-peer, without Wi-Fi or slow devices.
For a conference call, everybody connects to a central server acting as the relay. So now the latency budget is halved already.
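Plugging in the ~4-5 μs/km fiber figure from above, a quick back-of-the-envelope check (the distance is a rough great-circle estimate, not an actual cable route):

```python
# Propagation delay in fiber, from the ~4-5 us/km figure above.
US_PER_KM_LOW, US_PER_KM_HIGH = 4, 5


def one_way_ms(km: float, us_per_km: float) -> float:
    """One-way propagation delay in milliseconds."""
    return km * us_per_km / 1000


sf_paris_km = 8900  # rough great-circle distance
low = one_way_ms(sf_paris_km, US_PER_KM_LOW)
high = one_way_ms(sf_paris_km, US_PER_KM_HIGH)
print(f"SF->Paris one way: {low:.0f}-{high:.0f} ms")
print(f"Round trip:        {2 * low:.0f}-{2 * high:.0f} ms")

# Max peer-to-peer distance for a 30 ms round trip at 5 us/km:
rtt_budget_ms = 30
max_km = (rtt_budget_ms / 2) * 1000 / US_PER_KM_HIGH
print(f"Max distance for {rtt_budget_ms} ms RTT: {max_km:.0f} km")  # 3000 km
```

These are propagation-only floors; routing detours, Wi-Fi, buffering, and the relay hop all come on top.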
> latency-in-zoom kind of becomes an attribute of the medium and you learn to adapt.
To some degree but not fully. When you adapt your brain is still doing extra work to compensate, similarly to how you don’t «hear» jet engine noise after acclimating to an airplane but it will still tire you to some degree.
I had Zoom and Teams meetings daily during Covid, and personal FaceTime calls almost daily for a while. I still get «Zoom fatigue» if a call goes on for over an hour, if I need to talk face to face during the call (i.e. no screen sharing, can’t disable video and look at something else, etc.) I’m fine if I don’t look at people’s faces but rather people’s screen sharing.