It's fascinating that the biggest CRT ever made had a 43" diagonal, which is at the low end for modern flatscreen TVs. But yeah, I can see why the market for this beast was pretty limited: even with deinterlacing, SD content would have looked pretty awful when viewed from up close, so the only application I can think of was using it for larger groups of people sitting further away from the screen. And even for that, a projector was (probably?) the cheaper alternative...
In the late aughts I worked a summer at a company that was designing an articulating (flat screen) TV mount. I went with the engineers to one of the Intertek testing sessions. We wanted it to be rated for a 60" TV, but I was given the impression that the weight formulas they used for testing were based on CRT screens. The salesperson who came with us was giddy seeing the thing loaded up with 1,000 lb of steel plates and not giving way, but the actuators couldn't lift anywhere near that, and our advertised rating was no more than 200 lb.
I remember having the 36" version in ~1997. I wouldn't want to guess how much it weighed; it was insane. I remember how impressive it was watching The Fifth Element LaserDisc on it.
I had one of the first high-def Sonys on the US market. I worked at a high-end audio/video store in the mid 90s and they gave it to me cheap because they couldn't get rid of it.
Even at 34", the thing weighed 200 lbs (plus the stand it came with). I lived in a third-floor walk-up. I found out who my true friends were the day we brought it back from the store. I left that thing in the apartment when I moved. I bet it is still there to this day.
I'd forgotten how heavy CRTs are. A local surplus auction had a really tempting 30-something-inch Sony CRT for sale cheap, but when I saw it was over 300 lbs I had to pass on it.
A lot of those CRT screens had a pretty low refresh frequency; you were basically sitting in front of a giant stroboscope. That was particularly bad for computer screens, where you were sitting right in front of them. I think they pretty much all displayed at 30 Hz. I can imagine how a gigantic screen could get pretty uncomfortable.
I recall a lot of people playing Counter-Strike at 640x480 to get 100+ Hz refresh rates. The lower the resolution, the faster you can refresh. I don't recall the absolute limit, but it would give the latest LCD gaming panels a serious run for their money.
If you pay extra for that. Meanwhile _any_ CRT could trade off resolution for refresh rate across a fairly wide range. In fact the standard monitor resolutions were all just individual points in a larger space of possibilities. They could change aspect ratio as well. This could get quite extreme. Consider the 8088 MPH demo from a few years back (<https://trixter.oldskool.org/2015/04/07/8088-mph-we-break-al...>). See the part near the end with the pictures of 6 of the authors? That video mode only had 100 lines, but they were scrunched up to give the impression of a higher resolution.
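To put rough numbers on that tradeoff, here's a quick back-of-the-envelope sketch in Python. The 70 kHz horizontal scan limit and the ~5% vertical blanking overhead are assumptions I'm picking for a decent late-90s multisync monitor, not figures from any particular spec:

    # Sketch: a CRT's vertical refresh is limited by its horizontal scan rate,
    # so fewer lines per frame means a faster possible refresh.
    # Assumed numbers: 70 kHz max horizontal scan, ~5% of lines lost to blanking.
    H_SCAN_MAX = 70_000       # lines drawn per second (assumption)
    BLANKING_OVERHEAD = 1.05  # total lines / visible lines (assumption)

    for width, height in [(640, 480), (800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
        max_refresh = H_SCAN_MAX / (height * BLANKING_OVERHEAD)
        print(f"{width}x{height}: up to ~{max_refresh:.0f} Hz")

Which lines up with the 640x480-for-100+Hz Counter-Strike trick mentioned above: roughly 139 Hz at 640x480 down to about 56 Hz at 1600x1200 on the same tube.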
Well, we are discussing a CRT TV that was $40k new a lifetime ago, so perhaps the fact that it costs $599 to get a 480 Hz OLED today is not a consideration. To the point though: it is a fallacy to believe that CRTs could arbitrarily shape their resolution. While the input signal could cover a wide range of possible resolutions and refresh rates depending on the bandwidth supported, the existence of aperture grilles or shadow masks imposed a fixed digital reality that limited the maximum possible resolution to much lower values than the typical 4K panels that we have today. The "pixels" didn't become larger at lower resolutions: they just covered more dots on the mask. We can get much better results today with scaling than we ever could on CRTs, as awesome a technology as they were 40 years ago.
Sure, but 99% of that cost was paying for the absurd physical dimensions of that particular television.
> The "pixels" didn't become larger on lower resolutions…
Strictly speaking, the CRT only had discrete lines, not pixels. Within a line the color and brightness could change as rapidly or slowly as the signal source desired. It was in fact an analog signal rather than a digital one. This is why pixels in many display modes used by CRTs were rectangular rather than square.
> We can get much better results today with scaling than we ever could on CRTs…
I say it's the other way around! No ordinary flat-panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels. You would have to have a display built with just the right size and shape of pixel to do that, and then it wouldn't be any good for displaying modern video formats.
Seems irrelevant to bring up cost for something that is mainstream-priced today, but sure, let's move on.
> Strictly speaking, the CRT only had discrete lines not pixels.
The electron gun sweeps in an analog fashion, but on its way to the glass the beam can only pass through specific openings [1]. These openings are placed a specific distance apart [2]. That spacing sets the maximum horizontal resolution of the CRT, which is effectively a digital limit.
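As a rough illustration of how the mask/grille pitch caps resolution, here's a sketch with assumed example values (a ~19" 4:3 tube with an 18" viewable diagonal and a 0.25 mm pitch; real tubes varied):

    # Sketch: the mask/grille pitch puts a hard ceiling on how many distinct
    # phosphor triads exist across the tube, regardless of the input signal.
    VIEWABLE_DIAGONAL_IN = 18.0  # assumed viewable diagonal, inches
    PITCH_MM = 0.25              # assumed mask/grille pitch, millimetres
    ASPECT_W, ASPECT_H = 4, 3

    diag_mm = VIEWABLE_DIAGONAL_IN * 25.4
    width_mm = diag_mm * ASPECT_W / (ASPECT_W**2 + ASPECT_H**2) ** 0.5
    print(f"~{width_mm / PITCH_MM:.0f} phosphor triads across the tube")  # ~1463

So feeding such a tube anything much beyond ~1400 horizontal dots can't actually be resolved, no matter what the signal says.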
> No ordinary flat–panel display can emulate the rectangular pixels of the most common video modes used on CRTs because they are built with square pixels.
Today's panels have achieved "retina" resolution, which means that the human eye cannot distinguish individual pixels anymore. The rest is just software [3].
Yes and no. Only half of the lines were refreshed on each pass, so any given line was really flashing at 30 Hz. You still had a visible stroboscopic effect. True 60 Hz and 100 Hz screens appeared in the late 90s and made a visible difference in terms of viewing comfort.
CRT TVs only supported vertical refresh rates of 50 Hz or 60 Hz, matching the regional mains frequency. They used interlacing and technically only showed half the frame at a time, but thanks to phosphor decay this added a feeling of fluidity to the image. If you were able to see it strobe, you must have had impressive eyesight. And even if they supported higher refresh rates, it wouldn't matter, as the source of the signal would only ever be 50/60 Hz.
CRT monitors used with PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect if you ran one at a low frequency. But even the most basic analog monitors from the 80s supported at least 640x480 at 60 Hz, and some programs such as the original DOOM were even able to squeeze 70 Hz out of them by running at a different resolution while matching the horizontal refresh rate.
For some reason I remember 83Hz being the highest refresh rate supported by my XGA CRT, but I think it was only running at SVGA (800x600) in order to pull that rate.
Some demos could throw pixels into VRAM that fast, and it was wild looking. Like the 60Hz soap-opera effect but even more so.
I still feel that way looking at >30fps content since I really don't consume much of it.
> some programs such as the original DOOM were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal refresh rate.
400 lines at 70 Hz was the default timing of the VGA; pretty much all the classic mode 13h games ran at 70 Hz.
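For the curious, the 70 Hz drops straight out of the standard VGA timing numbers (the ~31.47 kHz horizontal scan rate and 449 total lines per frame are the commonly published figures; mode 13h reuses the 400-line timing with each line doubled):

    # Sketch: VGA 70 Hz = horizontal scan rate divided by total lines per frame.
    H_SCAN_HZ = 31_469   # VGA horizontal scan rate, ~31.47 kHz
    TOTAL_LINES = 449    # 400 visible lines plus vertical blanking

    print(f"~{H_SCAN_HZ / TOTAL_LINES:.2f} Hz vertical refresh")  # ~70.09 Hz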
The only time the electron guns were not involved in producing visible light was during overscan, horizontal retrace, and the vertical blanking interval. They spent the entire rest of their time (the vast majority of it) busily drawing rasterized images onto phosphors (with their own persistence!) for display.
This resulted in a behavior that was ridiculously dissimilar to a 30Hz strobe light.
Did they really do that, or did the tubes just run a 2x vertically stretched 640x240 with a vertical pixel shift? A lot of technical descriptions of CRTs seem to be adapted from pixel-addressed LCDs/OLEDs, and they don't always capture the design well.
The limiting factor is the horizontal refresh frequency. TVs and older monitors ran at around 15.75 kHz, so the maximum number of lines you could draw per second was around 15,750. Divide that by 60 and you get 262.5, which is therefore the maximum vertical resolution (real-world figures are lower for various reasons). CGA ran at 200 lines, so it was safely possible with a 60 Hz refresh rate.
If you wanted more vertical resolution, you needed either a monitor with a higher horizontal refresh rate or a way to reduce the effective vertical refresh rate. The former meant more expensive monitors; the latter was typically implemented by still having the CRT refresh at 60 Hz but drawing alternate lines on each pass. This meant that the effective refresh rate was 30 Hz, which is what you're alluding to.
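Spelling the same arithmetic out (same 15.75 kHz and 60 Hz figures as above):

    # Sketch: line budget of a ~15.75 kHz TV-rate display.
    H_SCAN_HZ = 15_750   # lines drawn per second
    FIELD_RATE = 60      # vertical sweeps per second

    lines_per_sweep = H_SCAN_HZ / FIELD_RATE
    print(f"{lines_per_sweep} lines per vertical sweep")        # 262.5 -> ~200 usable (CGA)
    print(f"{lines_per_sweep * 2:.0f} lines if you interlace")  # ~525, each line hit only 30 times/s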
But the reason you're being downvoted is that at no point was the CRT running with a low refresh rate, and best practice was to use a mode that your monitor could display without interlace anyway. Even in the 80s, using interlace was rare.
Interlace was common on platforms like the Amiga, whose video hardware was tied very closely to television refresh frequencies for a variety of technical reasons, which also made the Amiga unbeatable as a video production platform. An Amiga could do 400 lines of interlaced NTSC, slightly more for PAL Amigas, but for any more vertical resolution you needed later AmigaOS versions and retargetable graphics (RTG) with custom video hardware expansions that could output to higher-frequency CRTs like the SVGA monitors that were becoming commonplace...
CGA ran pretty near 262 or 263 lines, as did many 8-bit computers. 200 addressable lines, yes, but the background color accounted for about another 40 or so lines, and blanking took up the rest.
The irony is that most of those downvoting didn't spend hours in front of those screens like I did. And I do remember these things being tiring, particularly in the dark. Worst of all were computer CRT screens, which weren't interlaced (in the mid 90s, before higher refresh frequencies started showing up).
I spent literally thousands of hours staring at those screens. You have it backwards. Interlacing was worse in terms of refresh, not better.
Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.
With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.
And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
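A tiny sketch to make that concrete (toy line count just so the printout stays readable; a real NTSC frame has 525 lines, roughly 480 of them visible):

    # Sketch: which scanlines each vertical sweep (field) refreshes,
    # and how often any single scanline gets hit as a result.
    LINES = 12       # toy frame height
    FIELD_RATE = 60  # vertical sweeps per second

    progressive = [list(range(LINES))] * 2
    interlaced = [list(range(0, LINES, 2)), list(range(1, LINES, 2))]

    for name, fields in [("progressive", progressive), ("interlaced", interlaced)]:
        hits_per_second = FIELD_RATE * sum(f.count(0) for f in fields) / len(fields)
        print(f"{name}: one sweep draws {fields[0]}, the next draws {fields[1]}")
        print(f"  scanline 0 is refreshed ~{hits_per_second:.0f} times per second")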
Have you ever seen high-speed footage of a CRT in operation? The phosphors on most late-80s/90s TVs and color graphics computer displays decayed almost instantly. A pixel illuminated at the beginning of a scanline would be gone well before the beam reached the end of that scanline. You see a rectangular image, rather than a scanning dot, entirely due to persistence of vision.
Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE
Interesting. It's basically a compromise between flicker and motion blur, so I assumed they'd pick the phosphor decay time based on the refresh rate to get the best balance. So for example, if your display is 60 Hz, you'd want phosphors to glow for about 16 ms.
But looking at a table of phosphors ( https://en.wikipedia.org/wiki/Phosphor ), it looks like decay time and color are properties of individual phosphorescent materials, so if you want to build an RGB color CRT screen, that limits your choices a lot.
Also, TIL that one of the barriers to creating color TV was finding a red phosphor.
There are no pixels in a CRT. The guns go left to right, \r\n, left to right: basically a for line in range(line_number) inside a while True.
The RGB stripes or dots are just stripes or dots; they're not tied to pixels. There are three guns, physically offset from each other, coupled with a strategically designed mesh plate, arranged in such a way that electrons from each gun sort of moire into only hitting the right stripes or dots. Apparently fractions of an inch of offset were all it took.
The three guns, really more like fast-acting lightbulbs, each received a brightness signal for its respective RGB channel. Incidentally, that means they could swing between zero and max brightness on the order of 60 [Hz] * 640 [px] * 480 [px] times per second.
Interlacing means the guns draw every other line, but not necessarily discrete pixels, because CRTs have a finite beam spot size, at the very least.
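To put a rough number on how fast those brightness signals wiggle (blanking intervals are ignored here, so the real dot clock, about 25 MHz for VGA 640x480, is somewhat higher):

    # Sketch: ballpark brightness-modulation rate for a 640x480 @ 60 Hz picture.
    width, height, refresh = 640, 480, 60
    dots_per_second = width * height * refresh
    print(f"~{dots_per_second / 1e6:.1f} million brightness changes per second")  # ~18.4 million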
No, you don't sacrifice refresh rate! The refresh rate is the same. 50 Hz interlaced and 50 Hz non-interlaced are both ~50 Hz, approx 270 visible scanlines, and the display is refreshed at ~50 Hz in both cases. The difference is that in the 50 Hz interlaced case, alternate frames are offset by 0.5 scanlines, with the producing device arranging the timing to make this work on the basis that it's producing even rows on one frame and odd rows on the other. And the offset means the odd rows are displayed slightly lower than the even ones.
This is a valid assumption for 25 Hz double-height TV or film content. It's generally noisy and grainy, typically with no features that occupy less than 1/~270 of the picture vertically for long enough to be noticeable. Combined with persistence of vision, the whole thing just about hangs together.
This sucks for 50 Hz computer output. (For example, Acorn Electron or BBC Micro.) It's perfect every time, and largely the same every time, and so the interlace just introduces a repeated 25 Hz 0.5 scanline jitter. Best turned off, if the hardware can do that. (Even if it didn't annoy you, you'll not be more annoyed if it's eliminated.)
This also sucks for 25 Hz double-height computer output. (For example, Amiga 640x512 row mode.) It's perfect every time, and largely the same every time, and so if there are any features that occupy less than 1/~270 of the picture vertically, those fucking things will stick around repeatedly, and produce an annoying 25 Hz flicker, and it'll be extra annoying because the computer output is perfect and sharp. (And if there are no such features - then this is the 50 Hz case, and you're better off without the interlace.)
I decided to stick to the 50 Hz case, as I know the scanline counts - but my recollection is that going past 50 Hz still sucks. I had a PC years ago that would do 85 Hz interlaced. Still terrible.
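For reference, here's where the half-scanline offset comes from, using the standard 625-line/50 Hz numbers (the ~270 visible lines above is what's left of each field after blanking and borders):

    # Sketch: 50 Hz interlace falls out of an odd total line count.
    H_SCAN_HZ = 15_625  # horizontal scan rate, lines per second
    FIELD_RATE = 50     # vertical sweeps (fields) per second

    lines_per_field = H_SCAN_HZ / FIELD_RATE
    print(f"{lines_per_field} scanlines per field")                         # 312.5 -- that .5 is the offset
    print(f"{lines_per_field * 2:.0f} scanlines per full interlaced frame") # 625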
I think you are right; I had the LC III and Performa 630 specifically in mind. For some reason I remember them being 30 Hz, but everything I find googling it suggests they were 66 Hz (both video card and screen refresh).
That being said, they were horrible on the eyes, and I think I only got comfortable when 100 Hz+ CRT screens started being common. It is just that the threshold for comfort is higher than I remembered, which explains why I didn't feel any better in front of a CRT TV.
Could it be that you were on 60 Hz AC at the time? That is near enough to the screen refresh to produce a beat frequency ("Schwebung") with artificial lighting. Especially with fluorescent lamps, which were common in offices. They need to be phase-compensated ("phasenkompensiert"), meaning they have to be on a different phase of the mains than the computer screens are on. Otherwise even not-so-sensitive people notice it as interference, a sort of flickering. It happens less when you are on 50 Hz AC and the screens run at 60 Hz, but with fluorescents on the same phase it can still be noticeable.