Plz stop worrying. They only need that to help you out in the future, not to do any harm. Imagine you fall in love with their non-free offers later on but don't know how to contact them to share your credit card number - what a disruption on your end.
100%. What ppl often mix up is what's the shell's responsibility and what's the terminal itself. They are not really to blame for that, because their interactive terminal usage is mostly at a shell prompt.
Idk how they implemented things here, prolly by side-channeling information to/from the terminal widget. Which is bad - they'd better use in-band terminal sequences for that, so the ecosystem of TEs and cmdline apps can evaluate the sequences and incorporate them where they are useful (like the prompt marking OSC sequences...)
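For reference, this is the kind of in-band signaling I mean - a minimal sketch (TypeScript/Node purely for illustration; a real shell emits these from its prompt hooks) of the FinalTerm/iTerm2-style OSC 133 prompt marks:

```typescript
// Sketch of in-band prompt marking with OSC 133 (FinalTerm/iTerm2 style).
// A shell normally emits these from its prompt hooks; this wrapper only
// demonstrates the sequences themselves, the 'ls' command is arbitrary.
import { execSync } from "node:child_process";

const ESC = "\x1b";
const ST = `${ESC}\\`;                          // string terminator
const mark = (body: string) => `${ESC}]133;${body}${ST}`;

process.stdout.write(mark("A") + "$ ");         // A: prompt starts here
process.stdout.write(mark("B") + "ls\n");       // B: prompt ends, command text follows
process.stdout.write(mark("C"));                // C: command output starts here

let exitCode = 0;
try {
  process.stdout.write(execSync("ls").toString());
} catch (err: any) {
  exitCode = err.status ?? 1;
}

process.stdout.write(mark(`D;${exitCode}`));    // D: command finished, with exit code
```

Any TE that understands these marks can then offer "jump to previous prompt", "copy last command output" and so on, without any side channel between widget and backend.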
Very true, the Amish lifestyle alone debunks most of the narrative above as heavily skewed. It implies certain societal values and a certain way of thinking are natural or god-given, while in fact it only summarizes what's currently popular as the western lifestyle.
> Maybe I am naive, but do Amish typically worry about money and about materialistic things?
Imho everyone worries about materialistic things to a certain degree, as a potato for example is a very material thing and we cannot eat virtual goods (yet). The potato example can also be used to illustrate another aspect - all material things decay over time, thus naturally ppl would not hoard tons of goods beyond what they need for themselves or for trading for other goods (which will also decay, so it makes no sense to overly hoard traded goods either).
This did not change much throughout history. There was only one outstanding exception to that naturalistic equilibrium - land. Even old Bronze Age societies fought over fertile land, as that's the very base for sustaining life, and furthermore it provides access to other resources like ore and such, which we learned to process into other goods. So history tells us - yes, that's possible; the Amish kinda still live in that tradition.
What really changed that almost-equilibrium was the invention of modern money together with the banking system. The big difference now is that this virtual good "money" does not decay naturally (given the economy is stable). Instead the system even boosts itself by constantly creating more money, devaluing itself in the process. Most of us love to see the numbers increasing, because we tend to believe we can buy more things with them. But later we realize that goods just get a higher price tag instead (inflation), which further boosts the money hunger.
Here we are - the big hoarding tendency in western societies has a lot to do with our money system.
Wow, so much discomfort expressed here about slow scrolling. Are we all used to reading tons of pages of terminal output flying by? I'm certainly not, and XOFF doesn't work anymore in many setups. Pagers to the rescue, though they don't work everywhere...
Is it just me or does the article read a lot like disguised paid promotion? I have no problem with paid promotions at all, but then I would rather see a disclaimer.
Hah, 100% not a paid promotion, although it is kinda awesome that you think it is! We were thrilled with the coverage (our first article!) after we soft launched on /r/opensource.
It's true that a browser engine has higher resource usage initially - it can easily hog 50-100 MB without any content being loaded - but seriously, that's not much of a concern these days. On the other hand it provides a great programming environment for web devs, with many bells and whistles (which arguably might not even be needed for certain electron apps).
What concerns me more is what web devs actually do within that environment - it's almost like many devs have lost a basic sense of resource management, as if memory comes for free and gets garbage collected anyway, so why should I care.
Many electron apps are overly shiny in appearance, backed by tons of graphics and animations, but are in fact poorly managed under the hood. I am not quite sure if this is a direct effect of using a GC'd language, or if JavaScript+HTML in particular makes it too easy to get quite far without any comprehension of the inner workings of the browser and the machine underneath.
This is also a pain with modern webpages - I really don't get why a browser with 8 tabs open can easily grow to several GB after a while.
There are a few counterexamples - electron apps that do heavy lifting while still being nice to your RAM and CPU. Sadly that's the exception, not the common case. I tend to blame sloppy web devs here.
Wave uses electron for rendering, but a Golang backend for networking and heavy lifting. This gives Wave the ability to be cross-platform and still efficient under the hood.
I totally understand choosing an HTML engine for output rendering, given the type of additional content you want to provide on top of normal terminal stuff. This would be really painful and expensive to implement and maintain in native GUI libs across several platforms. And vscode kinda shows that it is possible to maintain a reasonable resource footprint while delivering a productive app with electron. They also achieve that with a relatively small team, which prolly would not have worked out with platform-native solutions.
May I ask how you drive the terminal emulation? Is this done by xterm.js or a custom terminal emulator? I am asking because xterm.js put quite some effort into optimizing things to be on par with other fast desktop terminal emulators, speed- and memory-wise.
Keyboarders are prolly the ppl with the fastest trained finger action on earth, be it on a computer keyboard or a music keyboard. In digital music production there is a magic threshold of 15-20 ms latency for input signals - if a signal takes longer to get through your DSP/computer lineup, any good musician will start to perceive the latency (btw that's more like an on/off phenomenon). Below that range we cannot detect any latency change anymore, as our neural system is not made for higher time resolution (also the reason why 60 FPS - roughly 16.7 ms per frame - gives us the illusion of continuous motion).
Typing latency on modern computers is really high compared to electronic or semi-digital typewriters of the 80s - those old machines had very little circuitry between the keypress and some output action, they were like hard-wired realtime machines to some degree.
On today's machines there are tons of buffers, clocks, context switches etc. to pass, with no realtime promises anymore (at least not on a typical consumer OS). We really have a "long line" in our machines today.
Electron adds another buffer stack to this madness - the browser engine needs to get the PTY chunks somehow, which is often done with websockets as IPC. That means putting chunks into a websocket frame, sending them through the network stack to the electron browser engine and decoding them there - and only then does the terminal see the data as input. A desktop TE does not have to go through any of that; it can simply write the PTY chunk to its data structures in the same process. That's the real overhead for data-IO-intensive electron apps regarding input latency.
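Roughly sketched below (assuming the common node-pty + ws + xterm.js stack - not any particular app's actual code), the extra hop looks like this:

```typescript
// Sketch of the extra hop: PTY output gets framed into websocket messages
// before xterm.js in the renderer ever sees it. Assumes node-pty + ws + xterm.js;
// port number and shell choice are arbitrary.
import { spawn } from "node-pty";
import { WebSocketServer } from "ws";

// --- backend / main process ---
const wss = new WebSocketServer({ port: 3001 });

wss.on("connection", (ws) => {
  const pty = spawn(process.env.SHELL ?? "bash", [], {
    name: "xterm-256color",
    cols: 80,
    rows: 24,
  });

  pty.onData((chunk) => ws.send(chunk));                  // PTY -> ws frame -> renderer
  ws.on("message", (data) => pty.write(data.toString())); // keystrokes go the other way
  ws.on("close", () => pty.kill());
});

// --- renderer process (browser side) ---
// import { Terminal } from "@xterm/xterm";
// const term = new Terminal();
// term.open(document.getElementById("terminal")!);
// const sock = new WebSocket("ws://localhost:3001");
// sock.onmessage = (ev) => term.write(ev.data as string); // decode frame, feed the TE
// term.onData((data) => sock.send(data));                 // user input back to the PTY
```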
I tend to disagree. The fact that a browser engine and JS are involved raises the bar significantly during security audits. It brings additional third-party players into the chain you have to trust. In the meantime I suggest reading https://xtermjs.org/docs/guides/security/.
Imho there is some misconception here about how the sixel protocol handles colors. It is not about any bit depth of an image, as the format is paletted. But unlike most standard paletted formats, it can redefine its colors on the fly, which basically lets it use an unbounded number of colors within one image (still limited to the 101³ distinct colors of its RGB space).
Now the actually tricky parts: the spec (DEC STD 070) states that the palette should have at least 256 slots on the decoder side, that an encoder should not create more than 256 colors, and that any higher palette index gets mapped back (not specced out, but mostly done with modulo). Older devices (old DEC printers or VTs) were even more limited in palette regards, from monochrome like the VT240 to 16 colors for screen output on a VT340. That's what xterm does with `-ti 340`, and it is totally right about it - it emulates what a VT340 was capable of.
What mlterm started to do (and others followed) - it did not care for strict VT340 emulation and increased the palette to 256 colors. Furthermore it applied "printer behavior" (note: sixel was developed as a printer protocol), deviating from DEC's VT behavior by immediately applying colors to pixels, whereas DEC's sixel-capable VTs were always bound to the fixed palette. That terminal-vs-printer distinction is important, as it opens sixel up to using more colors than the palette has slots for, by redefining color slots on the fly.
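To make the on-the-fly part concrete, here is a tiny hand-built sequence (TypeScript only as a convenient way to assemble the string - the escape codes are what matter):

```typescript
// Sketch: the same palette slot (#1) gets redefined mid-image.
// A "printer-behavior" decoder (mlterm-style) keeps the already painted pixels red;
// a strict fixed-palette VT340 emulation repaints them with the slot's final color.
const ESC = "\x1b";
const sixel =
  `${ESC}Pq` +                 // DCS ... q: enter sixel mode
  "#1;2;100;0;0" +             // define slot 1 as RGB(100,0,0) - values are percentages
  "#1!20~" +                   // select slot 1, paint a 20x6 block ('~' = all 6 pixels set)
  "$" +                        // graphics carriage return (back to the band start)
  "#1;2;0;0;100" +             // redefine slot 1 as blue, on the fly
  "-" +                        // move down one 6-pixel band
  "#1!20~" +                   // paint again - these pixels come out blue
  `${ESC}\\`;                  // ST: leave sixel mode

process.stdout.write(sixel);   // try in a sixel-capable terminal (e.g. xterm -ti vt340)
```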
I figured something like that was the case, thank you for the explanation. Still, that seems like a bad hack that terminals are not going to implement, because it goes off-spec. It seems as if the iTerm or kitty protocols (or anything else designed for a real screen and not a printer) would be a much better choice for a terminal trying to pick one.
Well it is not that bad; imho all newer terminal implementations that don't try to strictly emulate a VT340 do sixel this way. libsixel even promotes this as "highcolor". But again, sixel is still limited to ~1M colors in RGB and ~3M colors in HSL, even with that implementation trick.
The sixel format has much bigger issues besides its reduced color resolution - no alpha channel, the need for really expensive quantization and for serializing the printer-head movements, and bad cache locality due to its 6-pixel stride in the y direction. Its compression is lousy. All that said, encoding/decoding sixels is a mainly CPU-bound, resource-hungry task with high bandwidth needs - all for worse quality compared to modern formats. With modern hardware, where beefy GPUs exist, it is really a shame to insist on using this format (which was effectively dead for >20 years).
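For illustration, here is roughly what a naive per-band encoder has to do (a sketch only: indexed-color input assumed, no RLE, no real quantization step) - it shows both the per-color head movements and the row striding:

```typescript
// Encode one 6-row band of an already quantized (indexed-color) image.
// For every color used in the band the encoder sweeps the full width and samples
// 6 rows that sit a whole image-width apart in memory - hence the poor cache locality.
function encodeBand(
  indexed: Uint8Array,   // palette indices, row-major
  width: number,
  height: number,
  y0: number,            // first row of this band
  colors: Set<number>,   // palette slots that occur in the band
): string {
  let out = "";
  for (const color of colors) {
    out += `#${color}`;                        // select the palette slot ("change the ink")
    for (let x = 0; x < width; x++) {
      let bits = 0;
      for (let dy = 0; dy < 6; dy++) {         // 6 vertically stacked pixels per character
        const y = y0 + dy;
        if (y < height && indexed[y * width + x] === color) bits |= 1 << dy;
      }
      out += String.fromCharCode(63 + bits);   // '?' (none set) .. '~' (all 6 set)
    }
    out += "$";                                // carriage return: next color, same band
  }
  return out + "-";                            // band done, move down 6 pixels
}
```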
On the terminal side there are more issues with sixels and how they relate to cursor advance and the terminal grid, but going into those details would only bore ppl in a rant thread.