This is interesting to me, because I see this kind of comment on almost every Zed post.
I haven't used a low-DPI monitor in more than a decade, I'm pretty sure, so for me the weird blocker I have with Zed is the "OMG YOU HAVE NO GPU!!!! THIS WILL NOT END WELL!" warning (I run a lot of Incus containers via RDP, and they mostly have no GPU available).
But what kind of monitors are you low-DPI people using? Some kind of classic Sony Trinitron CRTs, or what? I'm actually curious. Or is it not the display itself, but some kind of OS thing?
Depending on the definition, I'm not a low-DPI user myself, but in my friend group I seem to be the only person who cares about >160 dpi; lots of people are using 1440p displays, or >34" 4k displays. In Apple's mind, high DPI (e.g. Retina) is >218 dpi, so my lowly 34" 5120x2160 doesn't count for them. But it is >160 dpi, which is my personal threshold for hi-DPI.
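(For anyone wanting to check their own setup: PPI is just the pixel diagonal divided by the physical diagonal. A quick sketch, using the 34" 5120x2160 panel above as the example:)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from native resolution and diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel-count diagonal
    return diagonal_px / diagonal_in

# The 34" 5120x2160 ultrawide mentioned above:
density = ppi(5120, 2160, 34)
print(round(density))  # ~163: above my 160 threshold, below Apple's 218
```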
There aren't all that many >20" displays on the market that meet Apple's definition of high dpi, and not a ton more that meet my much looser definition.
I have a 4-5 year old ultra wide monitor which is a lot of pixels but low dpi. I really like the single monitor containing two screens worth of pixels, but I wish it was high dpi. At the time there weren’t really high dpi ultra wides available, and they’re still expensive enough that upgrading isn’t a high priority for me… but I’m sure I will at some point.
Mine is 2560x1440, which is a pretty nice "sweet spot" size. A comparable 5k to 6k display still commands a substantial price, and - given that I work at two locations - would mean buying two of them. The screen I currently use (a 3:2 BenQ) also has some amount of subsampling going on: running it at 2x ("Retina native HiDPI"), all the UI controls are too damn big and there isn't enough space. Running it at 1x (everything teeeny-tiny) is just not very good for my eyesight and not very workable - and, again, Zed then bumps into the same broken antialiasing rasterizer they have.
And it is not an OS thing. The OS renders subpixel-antialiased fonts just fine. But Zed uses its own font rasterizer, and it completely falters when faced with a "standard passable resolution" screen - the letters become mushy, as if they have been blurred, and rather sloppily at that.
Linux and Windows are significantly better for both 1440p and 4k monitors. Both Linux and Windows have subpixel rendering and configurable font hinting for 1440p. And they both have fractional scaling UIs for 4k. macOS on the other hand only really looks acceptable on a 5k monitor.
When people say things like "mine is 2560x1440" on HN, are they talking about the Mac scaled resolution or the panel's native resolution? I feel like some context is always missing from resolution discussions, and it's a topic non-technical people weigh in on as well.
The 2560x1440 is QHD which is kind of a happy medium: high resolution enough to look really sharp, but not so high resolution that you have to scale it up like Macs do on retina displays. Having had retina Macs (and been very happy with them) since they came out, I've been using 16" and 17" QHD panels on my linux laptops for about five years... and they are actually just fine.
I actually don't understand what I'm missing. I'm using two old monitors, a 27" at 2560x1440 and a 23.5" at 1920x1080 (in addition to my high DPI Framework 13 screen). How else can I get at least 4480 across (after scaling to a font size I can read - I'm 49) and still cover that many inches? My DPI right now is about 100, so to double that, wouldn't I need 8960 across 44 inches? I don't really want to pay $1500 for resolution my eyes are probably too old to notice.
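(The arithmetic above checks out; sketching it, with the two panel sizes from the comment:)

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from native resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))    # 27" QHD: ~109 dpi
print(round(ppi(1920, 1080, 23.5)))  # 23.5" FHD: ~94 dpi

# Doubling density over the same ~44" of combined width really does
# mean doubling the horizontal pixels:
print(2 * (2560 + 1920))  # 8960
```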
It’s okay, eyes are just different. I personally enjoy 220 DPI, but 60Hz looks absolutely fine to me. However, at the workplace enough people complained about 60Hz that all the monitors at work are now 120Hz. I don’t notice any additional smoothness at all, so it’s all wasted on me.
Typical DPIs are still all over the place depending on the demographic. Macs have been ~200dpi forever, while cheap PCs are still mostly ~100dpi, and decent PC setups tend to land somewhere in the middle with ~150dpi displays which are pretty dense but not up to Mac Retina standards. Gamers also strongly favor that middle-ground because the ultra-dense Mac-style panels tend to be limited to 60hz.
Zed started out as a Mac-only app, and that's reflected in the way their font rendering works.
I guess that makes sense. I'm a 280ppi convert, so I judge Mac users with pity — Linux and Windows work perfectly with my 31.5" 8K display (from fuckin' 2017 btw...) but Macs can only drive it at 6K, which adds a fuzz factor.
Unless you use it at 4K, but macOS isn't really usable that way (everything way too small).
But yeah, it's 60Hz. Which has sucked ever since I accidentally got a 120Hz display, so now 60 Hz looks like 30Hz used to...
I had a chance to try that LG 45GX950A-B at Yodobashi Camera in Akihabara the other day, and... that measly 125ppi might overperform at the distance you have to put it at. But then again my 50-year-old eyeballs are starting to be like "anyway you need your glasses bro" so... YMMV
What does that mean? If the monitor only requires 15W to operate, that's a good thing, right? Unless monitors are expected to use less than that? I'm not familiar with reading monitor spec sheets.
To add on to what jsheard said: for this feature to be usable (i.e., charge your laptop just by plugging in the monitor), you need this number to be roughly what your laptop's charger supplies. At 15W, even a MacBook Air would slowly run out of power while plugged into this monitor, assuming you don't plug a second cable into your laptop. 65W or 90W is a much more normal value for a feature like this.
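(Back-of-the-envelope version of that, with made-up but plausible numbers - the laptop draw and battery capacity below are assumptions, not specs:)

```python
# Hypothetical numbers for illustration only.
laptop_draw_w = 30      # assumed sustained draw under light load; varies a lot
monitor_supply_w = 15   # this monitor's USB-C power output
battery_wh = 52         # assumed MacBook Air-class battery capacity

net_w = monitor_supply_w - laptop_draw_w          # negative = draining
hours_to_empty = battery_wh / abs(net_w)
print(f"net {net_w} W -> battery empty in ~{hours_to_empty:.1f} h")
```

With a 65W or 90W supply the net would be positive and the battery would charge instead.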
That all makes sense. The only thing I was missing was that this refers to power output. It seems like kind of a niche and tenuous value-add for a monitor. Why would I want to get power from my monitor?
Both at work and at home, I can connect my laptop to my monitor with a single cable. That single cable charges my laptop, carries the display signal, and passes through a USB hub built into the monitor that connects my keyboard and webcam. It's _incredibly_ convenient. It's also just a lot less cabling. You can think of it like a dock, built into the monitor for free.
> It seems like kind of a niche
Different workflows/circles. It's not something you're likely to use with a desktop, mainly with a laptop. It also really only works well if you use Thunderbolt. It's reasonably common but probably not a majority where I work, where 90% of dev machines are Macs.