The Living Computer Museum was an absolute treasure, but it shows one of the major hazards in the non-profit world. I founded the MADE, a non-profit videogame museum in Oakland. Just about 100% of my time when I was the director was spent raising money and soliciting partnerships so we could keep the doors open and the lights on. I would have killed to have someone like Paul Allen as a sponsor or involved in any way: we're a very poor museum compared to most, and money heals all wounds.
However, our little museum run with volunteers and pocket change managed to be more stable long term than the well-heeled and well-funded Living Computer Museum. Why? Because having a single benefactor for a non-profit is almost a death sentence. It's super risky, and as we've seen here, when that single donor goes away, it's all over.
It's super sad the LCM is in the state it is in, but for non-profits this is a cautionary tale: you cannot build a long term non-profit on the support of a single person. The world is too chaotic for that. Even a non-profit with a billionaire behind it isn't safe. It is incredibly poor planning on humanity's part to rely on the rich to preserve our shared heritage out of some obligation or moral or ethical duty. We need to preserve this stuff on the order of the Smithsonian.
You can, but there needs to be a large amount of capital allocated to a foundation, which can then fund it in perpetuity. It’s a bit surprising that didn’t happen in this case. Feels like his heirs have different ideas for how to spend his money.
From what I understand, Paul Allen's instructions were actually that his sister (Jody Allen) just give everything away. Unfortunately, that didn't include running the more niche museums, movie theaters, etc., forever. It seems like smaller projects are being sacrificed in favor of larger ones (like the Experience Music Project museum, which is still open).
Why? How lame is it to start cool projects like LCM and Cinerama and then just have it close upon your death when he had plenty of estate to fund it forever. Makes me think less of Paul Allen TBH (not that he or his relatives care)
I think he considered those hobbies to be wound down when he was no longer around to enjoy them? Yeah, you can think less of him because of that, but he spent his money on things he enjoyed that just happened to be things other people could enjoy as well. He wasn't really thinking of his post-death legacy, and just gave simple instructions in that regard (and his sister is taking the flak for that).
Jody Allen (his sister and only heir) is just following his instructions, from what I understand. She isn't financially benefiting from shutting down the computer museum; she is just executing his will:
> According to multiple reports Paul Allen’s will states that the Paul G. Allen Trust, which contains billions in assets, including the NFL’s Seattle Seahawks and NBA’s Portland Trail Blazers, will be liquidated upon his death and the assets used to fund his passion projects. Paul Allen died in October 2018 and the Trail Blazers have been rumored to be on the sales block for the past few years, but all’s quiet on the Seahawks front. Is Jody Allen trying to hold onto the Seahawks against her brother’s wishes?
The problem is, a lot of projects didn't make it into his list of passion projects. More:
> Sealed lips aside, here’s what we know: In 2010, Allen pledged to bequeath the majority of his wealth to philanthropy. (During his lifetime, Allen gave away more than $2 billion, according to the Chronicle of Philanthropy.) Tasked with this mammoth undertaking is his sister Jody Allen, trustee and executor of his estate, who is bound by her brother’s wishes as set out in the trust.
One can accuse his sister of not following his instructions, I guess, but unless we know what those instructions are, it isn't a very easy accusation to back up.
Selling an NFL franchise is also tricky business, and combined with Covid happening a couple of years after Paul's death, it makes sense that it may not be sold yet.
I think the point is that Paul Allen could have endowed his favorite causes when he was alive with an endowment large enough that the interest earned would fund continued operation indefinitely.
Yep. I don’t think the LCM, Cinerama, or his military museum were his favorite causes. I think they were his favorite toys. He apparently had other causes which he cared more about for the future.
Probably not all of it, but he could have funded his most-favorite toys at the expense of funding some of his big gambles a little less.
And that's the truth of the matter; I see Paul Allen's betrayal not as a failure to focus on the big things, but as focusing on the big things so much that he didn't see the little things that made life worth living. The LCM is a victim of that for sure.
I totally agree, and that's how, for example, the rich Oxford and Cambridge colleges are funded. I think it might be a big ask these days, though, especially with returns on investments relatively low. Sadly.
Thank you for founding the MADE! I visited a few months ago and met someone there who has become a close friend of mine. And, it’s a great benefit to the east bay and gaming history more broadly.
Because this stuff is an important part of our cultural, artistic and scientific heritage. We do not get to decide what the future finds interesting. Therefore, we must preserve as much as we possibly can so that those in the future have the ability to pick and choose. As it stands now, a lot of human history is just pieced together from trash and rubble.
You can go to Rembrandt's house in Amsterdam, visit where he painted, slept, ate, taught, and stored his junk. You can see how he worked, where he worked, what the lighting was like in his house, what the door looked like then and now.
Atari, on the other hand, is just gone. We cannot see their offices (they've been reoccupied a lot), and we have very little in the way of assets generated from the creation process back then. It is currently easier for us to see how Rembrandt worked 500 years ago than it is to see how Atari employees worked 40 years ago. That's not a good state of affairs, considering how influential Atari was on the evolution of home gaming.
A great deal of information about computer systems, even from fairly important computer companies, is pretty much lost forever once it's more than 20 years old. I was actually trying to reconstruct some details about one of the minicomputer companies that was purchased in 1999. There's very little left online outside of a Wikipedia article, and information about individual systems pretty much doesn't exist any longer.
> We do not get to decide what the future finds interesting.
That's factually wrong. If you decide to utterly obliterate something, the future can't find it interesting.
> Therefore, we must preserve as much as we possibly can so that those in the future have the ability to pick and choose. As it stands now, a lot of human history is just pieced together from trash and rubble.
That's a value judgement that will eventually succumb to its own contradictions. Preserving stuff, especially preserving "as much as we possibly can" is a luxury that many ages can't afford. It's all going to end up as "trash and rubble" eventually.
If you want to make something survive for the long term, make trash and rubble.
>> That's factually wrong. If you decide to utterly obliterate something, the future can't find it interesting.
> Object X may be deleted but did you make sure all references to X were deleted as well?
IMHO, something with references to it isn't utterly obliterated. But there are a lot of things with little-to-no references to them, and those are things someone can totally decide to make the future uninterested in.
You've never heard of John Titor? In all seriousness while you don't have to believe in time travelers, there is a lesson to be learned there. Look what happened to the Digital Domesday Project. If preservation efforts had begun earlier it might all be online now.
I’ll answer the who: people with money, capital and time choosing on their own what to spend their money, capital and time on to the degree to which they can sustain to do so.
> It is incredibly poor planning on humanity's part to rely on the rich to preserve our shared heritage
You can swap "preserve our shared heritage" with many things and it still holds. It's incredibly poor planning to rely on the good-will of the rich for lots of things, yet here we are...
> Because having a single benefactor for a non-profit is almost a death sentence.
Not necessarily, if the benefactor endows the NPO with a long-term trust rather than simply funding it. Endowment is what makes the difference between a charity and a hobby.
Paul Allen had a lot of hobbies, but he wasn't that charitable, in the sense that many of his good works seem to have died with him. There are obvious exceptions, but the LCM was sadly not one of them.
> having a single benefactor for a non-profit is almost a death sentence.
In a way, this reminds me of when Google shut down Reader.
This is also why you should only use standards, platforms and programming languages supported by a multitude of companies and organizations, and avoid those dominated by a single company.
I agree with your latter statement, but your choice of example feels weird to me… RSS was (and is) an open standard supported by an incredible number of both readers and publishers. How does it not meet your standards?
Being a computer museum is also an open standard, and there are any number of competitors. But it is still a problem when a large computer museum shuts down.
As a long term collector of old PC hardware, software and games, it's becoming pretty obvious that keeping aging hardware around has a very tangible expiration date to it. We are only able to keep this tech going because a lot of it was mass produced and we can find replacement parts, but we're already seeing massive failure in parts which render a device unusable regardless of how many blown capacitors or resistors we change.
There comes a point when the only way to keep this stuff running is either through manufacturing replacement parts (prohibitively expensive) or through emulation. Emulation is cheap but completely unrepresentative of the original experience. For example, you can't emulate different displays properly. As anyone who enjoyed the Vectrex's beautiful vector graphics will agree, you simply cannot emulate this on a modern display. The bright phosphor coating with the smooth analog beam is not reproducible on something like an LCD.
So, I still collect this stuff and enjoy it. But I have given up my ambition for hardware preservation. It will take an organization much better funded than me to keep this stuff alive in the long term. And even then, I suspect a lot of it is going to be lost to time in the coming decades.
> manufacturing replacement parts (prohibitively expensive)
Some are more than others. PALs and ASICs can be reimplemented in CPLDs and FPGAs. At some point it becomes Theseus' computer, but, if the goal is to preserve the boat, that works. If you want to preserve the wood planks, then you need to keep it powered down in a controlled environment for future generations to apply atomic-resolution sensing eventually.
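A PAL's logic is just a fixed sum-of-products of its pins, so the usual route to a CPLD/FPGA replacement is dumping the truth table from a working chip and re-expressing it in an HDL. A toy sketch of that idea, with an equation invented purely for illustration (a chip-select-style decoder, /CS = A15·/RD + A15·/WR):

```python
# A PAL output is a fixed sum-of-products of its input pins. The
# equation below is made up for illustration; a real recovery effort
# reads these terms back from the original chip.

def pal_output(a15, rd, wr):
    """Evaluate one hypothetical PAL output equation."""
    return (a15 and not rd) or (a15 and not wr)

def dump_truth_table():
    """Exhaustive read-out, as you'd do against the real chip on a
    test rig before writing the CPLD/FPGA equivalent."""
    return [
        (a15, rd, wr, pal_output(a15, rd, wr))
        for a15 in (False, True)
        for rd in (False, True)
        for wr in (False, True)
    ]
```

For combinational parts this kind of brute-force dump fully characterizes the device; registered PALs and ASICs with internal state are far harder, which is part of why this gets expensive.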
> For example, you can't emulate different displays properly. As anyone who enjoyed the Vectrex's beautiful vector graphics will agree, you simply cannot emulate this on a modern display. The bright phosphor coating with the smooth analog beam is not reproducible on something like an LCD.
Not right now, but, soon-ish, we'll be able to do that very well with HDR displays. I already can't see pixels or jaggies on my LCDs. With proper computation, one could simulate the analog aspects of CRT tubes - it's a lot for GPUs to do now, but, in a decade or so, I assume a mobile GPU would be able to do that without breaking a sweat.
Not long ago I was thinking about a CRT+deflection replacement that would take the analog signals used in CRTs (from an adapter on the neck, where the pins have a lot of variation, and the inputs to the coils), and, maybe, an extra power supply input to power the electronics, and spit out an HDMI signal on the other end.
This should be possible with modern flat panels to a point the image is hard to distinguish for all but the most extreme (think X-Y displays and Tektronix DVSTs) cases.
Curvature is an issue, but flat and Trinitron CRTs should be trivial.
That really depends on what you’re trying to emulate about a display. You can see artifacts from how the electron beam on a CRT paints the image by holding your hand in front of the screen, fingers spread, and shaking your fingers back and forth. Emulating that well might take a ~10,000fps display, which I doubt anyone is ever going to produce.
I suspect even most hardcore retrocomputer hobbyists care most about emulating the parts of a display that actually came up in use of the machine. If I were an eccentric billionaire who wanted a replica of the Mona Lisa to hang on my wall and enjoy regularly without the inconvenience of weekly flights to France, I'd care much more about my money going into making a product that got the visible details of the canvas right (https://hyper-resolution.org/view.html?pointer=0.055,0.021&i...) than spoofed the proper results if I carbon-dated it or something. I think the same concept applies here. I don't really care as much if the only thing an emulator can't replicate is a clever but (to an observer) comically specific physics-based test for authenticity if I get everything I'd notice while using the computer correct at a fraction of the price. In the context of preservation, just knowing that some other (far richer than me) person or org is keeping a single-digit number of the actual artifacts maintained for future reference is good enough for me.
I'm not sure I follow. CRTs draw the image to the screen in a fundamentally different way than modern displays due to how the electron beam moves sequentially left to right/top to bottom. This analog process, scanning at 15 or 25 kHz horizontal rates, is what gives authentic arcade machines their look and feel. Same for old computer terminals. My understanding is that to reproduce this effect on a modern display, you'd need an extremely high refresh rate. To properly replicate this requires some pretty low-level aspects of the system to be addressed. Hardware limitations are bound by the laws of physics, after all.
Beyond just the aesthetics, there are practical reasons why this is important, whether it be light gun idiosyncrasies or how the game "feels," which can affect timing and such for competitive players. There's a lot more to preserving the look, feel, and compatibility of displays for old computer systems than most realize, and the rabbit hole can go quite deep on this one.
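The raster rates behind this "look and feel" can be pinned down with textbook NTSC arithmetic (standard figures, nothing machine-specific):

```python
# Standard NTSC raster timing: the numbers that set the look the
# comment above describes for 15 kHz ("standard-res") arcade monitors.
LINE_RATE_HZ = 15_734.26       # horizontal scan rate
FIELD_RATE_HZ = 59.94          # vertical (field) rate, interlaced

LINES_PER_FIELD = LINE_RATE_HZ / FIELD_RATE_HZ  # ~262.5 lines
line_time_us = 1e6 / LINE_RATE_HZ               # ~63.6 us per scanline

# A modern display that wanted to reproduce the beam sweeping line by
# line would need to update at roughly the line rate, not the field
# rate - i.e. ~15,000 updates per second.
print(f"{LINES_PER_FIELD:.1f} lines/field, {line_time_us:.1f} us/line")
```

This is why "extremely high refresh rate" comes up: the field rate is an ordinary 60 Hz, but the beam's progress through a field happens at line granularity, hundreds of times faster.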
> there are practical reasons why [how the electron gun works is] important, whether it be light gun idiosyncrasies or how the game "feels,"
This is always interesting to discuss because there are so many factors at play! To put it in less than a zillion words,
The way a game "feels" in this context is essentially a function of input latency. The old-style "chasing the beam" hardware, plus a CRT display, equals something very close to a true zero lag environment.
In an ideal emulation situation, you could theoretically recreate something close to a zero-lag analog environment (in terms of latency) without necessarily simulating the path of the electron beam itself.
Although, as the linked article implies, there are a lot of bits in the emulation stack that would need to be optimized for low latency. High refresh rate displays get you part of the way there "for free."
Sure, and even many games don't particularly benefit from it. However, it's a really remarkable thing to play e.g. Mega Man or Smash Bros. in a true lag-free environment.
Perhaps. One issue I foresee is the way CRTs glow. The phosphor doesn't light/dim immediately the way an LED does. So there's some amount of fade in/out that happens on a CRT as the beam moves across the screen. I imagine this could be difficult or impossible to reproduce with a traditional OLED screen. Some old games rely on this technique, along with the slow refresh rates, to create a sort of dithering/aliasing effect.
Phosphor decay is not terribly difficult to simulate to an acceptable degree. Doing it at the pixel level is pretty easy, doing it at the phosphor level is computationally harder but not much more complicated.
The larger issue w.r.t. this specific quirk of CRTs is that we're running out of human beings that are familiar with what this is "supposed" to look like, and actually care.
I'm not aware of any cases where it's been emulated in any acceptable manner. I can't be bothered to do the math myself, but I imagine doing this well would be beyond the capabilities of modern displays (probably in the 1000s of hz refresh rate). Maybe some special FPGA based controller with an OLED like was suggested above could make it possible. I'm not sure.
Each individual phosphor dot on a CRT is not terribly tricky to emulate.
The brightness at any given moment is a fairly simple decay function based on how long it's been since you lit it up with the electron gun. On top of that, you would typically want to apply some level of bloom to simulate the way light is diffused by the glass. Sure, you've got a few million dots to simulate, but this is also an embarrassingly parallel problem.
Now of course, admittedly, you're only simulating that phosphor glow decay at the refresh rate of your monitor -- 60hz, 144hz, 240hz, whatever -- instead of an effectively infinite level of steps as would be the case in real life. However, I don't think that is a practical issue.
You're clearly thinking of factors I'm not and I'm genuinely interested. To my mind, the visual aspects of CRTs are pretty easy to simulate, but not the near-zero lag.
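For what it's worth, the decay function described above can be sketched in a few lines. This assumes a single-exponential model with a made-up time constant; real phosphor persistence curves are multi-component and phosphor-specific:

```python
import math

# Per-dot phosphor decay, single-exponential model (illustrative only;
# the time constant is invented, not measured from any real phosphor).
DECAY_TAU_MS = 2.0             # assumed persistence time constant
FRAME_TIME_MS = 1000 / 240     # host display running at 240 Hz

DECAY = math.exp(-FRAME_TIME_MS / DECAY_TAU_MS)  # per-frame attenuation

def step(brightness, beam_energy):
    """Advance one host frame for one phosphor dot: decay the existing
    glow, then take whatever new energy the beam deposited this frame."""
    return max(brightness * DECAY, beam_energy)

# One dot lit once, then left to fade over a few host frames:
b = step(0.0, 1.0)             # beam hits the dot
for _ in range(3):
    b = step(b, 0.0)           # no re-excitation; glow decays
# b is now DECAY**3 of full brightness. Each dot is independent of its
# neighbors (bloom aside), which is what makes this embarrassingly parallel.
```

The visible limitation is the one noted above: the decay is only sampled at the host display's refresh rate rather than continuously, so higher-refresh panels approximate the analog curve more closely.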
The thing you can't emulate is the phosphor coating. It simply looks different because light isn't coming from a backlight; the front of the display is actually shining. And in vector graphics you don't have pixels at all; the light shines quite beautifully in a way I don't think is possible at all with backlit displays.
> The thing you can't emulate is the phosphor coating. It simply looks different because light isn't coming from a backlight; the front of the display is actually shining.
It would need to redraw the whole screen to account for the phosphor decay. To do that with line resolution and an NTSC signal, you’d have to redraw it roughly 15,000 times per second (60 fields of about 250 lines each). You’d draw the current line at full brightness and “decay” the rest of the frame according to that phosphor persistence. Since there is some quantization, you could reduce the frequency of decays as the line gets older.
On the concept of these very weird displays, I remember an HP oscilloscope that had a monochrome CRT and a polarizer in front cycling between R, G, and B components on every refresh cycle. Overall, the screen resembled a DLP projection when you'd see different color frames when your eyes moved, but a stable color when you were looking at a part of the screen. A very neat way of producing crazy small color pixels on a 7"ish CRT.
And yes, that device cost about the same as my house back then (2002).
I'll give you an example from the LCM. They had a PLATO terminal with its original plasma flat panel display. I'd been reading about PLATO for years and had even run it in emulation but I'd never seen actual hardware before visiting the LCM.
The experience on the original terminal was way different than emulation. The way the screen worked and the tactile feel of the keyboard was the core of the experience of it. Sitting at an actual terminal really changed my understanding the system because it gave me a physical context that software emulation could not provide. You'd be hard pressed to emulate the eye melting nature of the original plasma display or the stiffness of the keyboard.
The physical experience is a huge part of the overall thing. I have a C64 Maxi and it's absolutely amazing, exquisitely close to the original (but with an HDMI output and USB ports)
> I'd care much more about my money going into making a product that got the visible details of the canvas right than spoofed the proper results if I carbon-dated it or something
You've inadvertently highlighted one of the challenges of preservation: identifying which aspects matter.
Does fooling a carbon dating test matter? This is purely subjective, but for most people surely not.
But interestingly you've linked to an ultra high resolution image viewer that lets the viewer drill down into a nearly microscopic view of the painting. If a person doesn't know much about art, they might think that if you could take something like this and hang it on your wall, it would be a pretty damn good replica of the real thing. It would certainly be cool, I have to admit. Hell, I'd love it on my wall.
And yet, it's utterly different than the real thing. Paintings in real life are three dimensional. Van Gogh in particular is one who used thick gobs of paint. Each fraction of a micron of the painting has height and its own reflective properties which interact with the light in the room as you walk around and observe it.
> if I get everything I'd notice while using the computer correct at a fraction of the price.
Well, that's the thing. It's certainly up to the individual whether or not they give a crap about any particular detail.
If you don't care about how oil paintings actually look in real life, or what video games actually looked and felt like, and you choose to brand all of the things you don't understand or don't care about as "comical", then... well, more power to you. That's your choice.
> Not right now, but, soon-ish, we'll be able to do that very well with HDR displays [...] flat and Trinitron CRTs should be trivial.
Visually I think we're really close in most of the ways that matter, with advanced shaders like CRT-Royale.
However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.
It's not possible to recreate this with modern tech. When we add up everything in the stack (display, input drivers, everything) we're looking at over 100ms of lag.
We're not totally without hope. As the (ancient!) article notes, higher refresh rate displays reduce many of these latencies proportionally. And for non-action games, latency doesn't matter too much in the first place.
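The "adding up everything in the stack" claim can be made concrete with a back-of-envelope budget. Every per-stage number below is an illustrative assumption, not a measurement; real values vary wildly by hardware and settings:

```python
# Back-of-envelope input-to-photon latency budget for an emulator on a
# 60 Hz LCD. All figures are illustrative assumptions.
budget_ms = {
    "usb_controller_polling": 8.0,   # ~125 Hz polling, worst case
    "os_input_to_emulator": 2.0,
    "emulator_frame_queue": 16.7,    # one buffered frame at 60 Hz
    "compositor_vsync_wait": 16.7,   # waiting for the next vblank
    "display_scanout": 8.3,          # mid-screen pixel, on average
    "panel_response": 5.0,           # liquid crystal switching
}
total = sum(budget_ms.values())
print(f"total ~{total:.1f} ms")      # vs near-zero for beam-chasing HW + CRT
```

Note how the two biggest entries scale with refresh rate, which is why high-refresh displays cut a large share of the total "for free," as mentioned above.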
> At some point it becomes Theseus' computer, but, if the goal is to preserve the boat, [CPLDs and FPGAs] work
> However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.
It's hard to even match the input latency and general responsiveness of an NES hooked to a CRT TV with a composite cable with modern hardware, let alone something more-integrated.
My usual test is "is it very, very hard to get past Piston Honda on Punch Out?" Often, with an initial, naive set-up, it's nearly impossible. Get it dialed in a little and he becomes easy (if you spent like a billion hours playing that game as a kid, anyway). But with many display + computer + controller combos it's just impossible to get it close enough to right, no matter how you tune it.
That's my test because it's really easy to tell if the latency is bad, but if it is I'll find myself falling off things constantly in Mario, too, it's just harder to tell if I'm playing poorly or if the system's the problem. The NES is hard enough, add just a little latency and it's hellish.
Latency is one dimension. Another one is peripheral compatibility. Devices such as light pens and light guns are incompatible with LCDs. These peripherals depend on the timing information encoded by the raster scan of CRTs.
This means classic light gun games such as Duck Hunt for the NES are impossible to play without a CRT.
Doesn't duck hunt use an entire frame of black then white? You don't need raster scan emulation for that, just very low latency.
Edit: Apparently there is a 15kHz filter inside the light gun. So it's not really about beam accuracy or brightness, it's about pulsing at the right frequency.
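The black-then-white sequence can be sketched as game-side logic. This is a simplified illustration of the commonly documented Zapper behavior (one dark frame, then one lit frame per target, read through the gun's photodiode), not actual NES code:

```python
# Simplified sketch of the NES Zapper detection sequence: on trigger
# pull, the game blanks the screen for one frame, then draws a white
# box for each target on successive frames. The photodiode (behind its
# ~15 kHz CRT flicker filter) reports light/no-light each frame.

def detect_hit(sensor_readings):
    """sensor_readings[0] is the all-black frame; readings[1:] are one
    white-box frame per target. Returns the index of the target hit,
    or None for a miss / invalid reading."""
    if sensor_readings[0]:
        return None  # light during the black frame: gun isn't aimed at the CRT
    for target, lit in enumerate(sensor_readings[1:]):
        if lit:
            return target
    return None      # no light on any target frame: missed everything

# Aimed at the second duck: dark on the black frame and on duck 0's
# frame, lit on duck 1's frame.
print(detect_hit([False, False, True]))  # → 1
```

The all-black check is the anti-cheat half of the trick (pointing the gun at a lamp fails it), and it's also why a display that can't reproduce the per-frame light levels breaks these games.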
LCDs aren't capable of pulsing at 15kHz. The twisting/untwisting of the liquid crystals is an electromechanical process and very slow (compared to a racing electron beam). Even though the fastest gaming LCD monitors claim a 360Hz refresh rate, they cannot get anywhere near this 2.8ms latency (implied) when going from black to white (0 to 100% brightness). Of course, the monitor manufacturers go to great lengths to avoid talking about it, so the whole industry is flooded with a bunch of marketing material to distract from the issue.
Yeah, I know. But you don't need the LCD to pulse, you need the backlight to pulse.
An LCD might still have issues switching fast enough, but an HDR OLED tuned to 15kHz PWM might be able to handle it. If it was designed with minimum latency in mind of course. Most screens buffer a full frame and that won't work. But playing duck hunt doesn't require timing the actual beam as it sweeps across the screen. You just need to be displaying rows with a buffer that's no more than a few rows high, and have the rows flicker. Also many third party controllers don't care that much about the flicker.
Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15kHz. The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.
The really sad thing is that some day all of the CRTs will be dead and all of the expertise to build them too. The tooling and factories are already gone, so it's unlikely new CRTs will ever be built, unless some Ben Krasnow-esque super-hobbyist gets really passionate about it.
> Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15kHz.
You could but it wouldn't have the same brightness as it sweeps so I don't know if that's good enough to trick a light gun by itself.
But I still think you shouldn't discount LCD. If you can get an LCD to switch a good fraction of the way in half a millisecond, and use the right backlight, you could make duck hunt work.
> The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.
To trick the frequency filter might be too much effort, but the latency of having each row come along instantly might get some effort into it. Marketing loves low latency.
The low latency claimed by LCD marketers concerns narrow grey-to-grey transitions. Black to white remains as slow as ever.
The other issue is all of the other causes of latency along the pipeline. The NES emits a composite video signal directly from its video chip, the PPU. This composite signal travels along a coax cable into the back of the TV, where it’s split by the analogue circuitry into signals driving the luminance and colour, synchronized to the horizontal and vertical retrace. The whole process happens in less time than it takes to convert that signal into digital before it could even be sent to an LCD.
That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.
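The timing gap here is easy to quantify with standard NTSC figures; only the digitizer overhead below is an assumption:

```python
# NES -> CRT path, standard NTSC figures.
LINE_TIME_US = 63.6            # one scanline of composite video
FIELD_TIME_MS = 1000 / 59.94   # one field, ~16.7 ms

# A CRT puts each scanline on the phosphor as it arrives, so the
# worst-case "latency" for any pixel is under one line time:
crt_latency_us = LINE_TIME_US

# A scaler that buffers even a single field before output adds a full
# field of delay on top of ADC/processing time (overhead assumed):
adc_processing_us = 500        # assumed digitizer overhead
lcd_latency_us = FIELD_TIME_MS * 1000 + adc_processing_us

print(f"CRT ~{crt_latency_us:.0f} us vs field-buffered LCD ~{lcd_latency_us/1000:.1f} ms")
```

Line-buffered (rather than field-buffered) LCD designs close most of that gap, which is the point made a few comments up, but the CRT's sub-100-microsecond figure remains the baseline to beat.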
> The whole process happens in less time than it takes to convert that signal into digital before it could even be sent to an LCD.
While that's true, doing the conversion doesn't need to add more than a microsecond of latency.
> That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.
An LCD doesn't have to buffer a frame either. I believe there are models that don't. It can display as it receives, limited by the crystal speed which is still an order of magnitude improvement.
Current consumer LCDs, sure, but there's no reason a high-refresh-rate LCD couldn't emulate the flying spot of a CRT, and thus be compatible with light pens / guns.
In order for that to work, it’d need to be able to switch individual pixels in sequence, one at a time. The display panel would need to be designed for this - current panels aren’t, but as long as a screen position could switch to 100% in about 250ns, a sensor could tell precisely which pixel it’s looking at.
Liquid crystals cannot switch from 0 to 100% in less than 10ms, never mind 250ns. They’re electromechanical devices that need to physically twist/untwist to affect the polarization of light.
Contrast that with a CRT which uses a 25kV acceleration voltage to drive an electron beam up to 30% the speed of light (takes about 3.3 nanoseconds to travel 1 foot from the back of the CRT to the screen), which then strikes a phosphor that glows due to its valence electrons falling from excited states (which takes a few nanoseconds).
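The beam-transit figure quoted above checks out with simple arithmetic, taking the ~30% of c figure at face value (a relativistic treatment for 25 kV shifts it only slightly):

```python
# Transit time of the electron beam over ~1 foot of tube depth,
# using the ~30% of light speed figure from the comment above.
C = 3.0e8                  # speed of light, m/s
BEAM_SPEED = 0.30 * C      # ~9e7 m/s
TUBE_DEPTH_M = 0.3048      # 1 foot

transit_ns = TUBE_DEPTH_M / BEAM_SPEED * 1e9
print(f"{transit_ns:.1f} ns")  # ~3.4 ns, consistent with the ~3.3 ns quoted
```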
> ...then you need to keep it powered down in a controlled environment
Mold and water damage can be mitigated with environmental controls, but even then you're going to have decay issues because so many components just break down over time. Many plastics become brittle and disintegrate over time, with or without exposure to UV. 10-year-old devices have their soft-touch rubber turning into a gooey, sticky, melting mess. Electrolytic capacitors and batteries leaking are commonly known, but lesser-known issues occur too. The MFM drive testing that Adrian's Digital Basement did recently comes to mind: 5 of 5 drives he tested were dead. One was a drive he'd validated a year prior and stored correctly.
I’ve never seen a Vectrex, but having seen other vector displays on old arcade machines, it’s truly mind blowing.
The warm glow of those smooth beams has no modern equivalent in consumer hardware. Despite technically being old technology, and some machines being over 4 decades old, it can still blow young minds seeing a way of rendering so different from what we accept as the norm today and how it’s in many aspects even more advanced than modern screens.
Knowing it’s something that’s gradually being lost with time is pretty sad. Previous human technology advancements would often be lost in a hole and still be recognizable for what they were centuries later when they were dug up. It’s kind of strange that the tech we’ve made since the computer revolution basically dies along with its creators and maintainers, turning into future door stoppers.
I’m not sure I follow your final conclusion… we mostly don’t use archeological artifacts do we? They are “recognizable” but so will old computers be 1000 years from now. With a good microscope you can still study the chips, etc.
If anything I expect computers from 1985 to be much better “preserved” than physical history from, say, 885, no?
Granted you have to resist the urge to use them as door stoppers and keep them in a controlled environment. But an archive is a lot cheaper/more stable than a living museum.
Just to use your example, you could dust off a 200 year old microscope and it’d be perfectly fine. If there’s a cracked lens, that’s an easy fix without any significant expertise.
Same with any old blacksmithing tools and so on. They’ll look old, but generally retain their functionality and a novice can repair them.
Old electronics require expert knowledge and specifically manufactured parts to fix. Knowledge of old electronics dies each year, and the parts go away. A person who’s never heard of a Vectrex or seen one in action would have no idea what it does. Even if they recognize that it has a screen, they wouldn’t be aware that it renders vectors, not pixels like nearly every other screen. Another 1000 years removed from the computing standards of today, people might not even be able to imagine what a typical desktop computer does when they look at it. But they’ll see tools from 885 and generally be able to guess what they’re for.
High-tech things are very abstracted tools that are just going to look like useless boxes to future people. Most won’t know or care what they did. It’s weird to think about. Advances will also be lost with time without us realizing it. Vector screens are truly impressive, but people don’t know they exist. Will we someday get monochrome retinal implants convenient enough that people give up on traditional computer screens, and humanity eventually forgets we all used to have full-color displays with sound in our pockets? Who knows.
we mostly don’t use archeological artifacts do we?
Well that's the thing, right? A 5,000-year-old sword or pot is static. It's not quite the same as actually using and touching the objects, but you can see them in the museum and it's fine.
Something like a car engine or a video game machine, that's a collection of moving parts. No single part of them is interesting. It's how they work together. Problem is, each time you use them they get a little bit closer to death and they also get closer to death just sitting there in a climate-controlled environment.
You do notice a lot of things about older items by actually using them. Little touches, weight, balance, the way the design is remarkably nice for some particular use or other in ways and for reasons you wouldn't have guessed just from looking at it, the satisfying feel of a hinge or knob or pull. Our local art museum has a collection of elaborate silver tea services—originally intended for actual use, not just decorative—and I bet one would notice a lot of things about them by actually using them, that one is unlikely to spot or understand just by looking, but no-one's likely to ever use them again—at least not for quite a while.
I love the Atari vector games! There’s an excellent video by Retro Game Mechanics Explained on how they work: https://youtu.be/smStEPSRKBs
Recently I stumbled on some Tektronix videos on YouTube. Those vector displays are even more mind blowing than Atari’s arcade games. I hope I can get to see one in person someday.
This can really be a labor of love. The volunteers at the Rhode Island Computer Museum, which is much more modestly funded than LCM, have ongoing restoration projects for rare vintage machines like the PDP-9 and PDP-12. They literally have a warehouse full of stuff like this.
In the dark of winter, while I work to fill the hours, I troll eBay and Craigslist... looking at the $200 Macs and $400 Amigas and $800 Apple ][s and $2,000 complete NeXT stations, and think it's another hobby that's gotten more expensive faster than I have the stomach for.
I also realize I'll get 95% of the enjoyment by just hitting the emulators at archive.org.
I gifted myself an Amiga Mini500 for xmas and once you play the games, and mess with Amiga Forever for an afternoon, and Amibian for ANOTHER afternoon...that scratch is pretty well itched.
A bit off-topic but relevant to the aging problem of tech: is there any commonly-available persistent storage medium you would use to store say, family movies for 100 years? Or is that a hopeless cause? For instance, will BluRay players be around for much longer? USB drives?
At what point are we where "Either put it in the cloud or lose it" is the law of tech?
I think it is safe to say you have to plan on moving media you care about to a different format every couple of decades. From the first Edison wax cylinders to today's latest format, the technology has changed that often, and if you care, you have to follow. Sure, you can keep older stuff working, and some do. You can say the same about photographs and books: most degrade after a few decades, but there, if you spend $$$, you can get archival grade that will last for a century or two (many centuries if stored in a desert, but in human-friendly places a century is all you should count on).
I have some archival grade CDs and DVDs that claim they will last longer. However I'm not sure if readers will exist as long as the media.
I've always wondered if you could do something with dot-matrix-on-aluminum-foil. It wouldn't look the prettiest and the information density would be low, but it seems like it would be cheap and last forever.
The scene mentioning the big EMP-reset in Blade Runner 2049 made me wonder just how much would be lost to us in such a scenario. Much that we care about is currently on highly sensitive storage.
What format should I invest in (developing/purchasing/etc.) that would A) be easy-ish to read from, B) be dense enough to beat paper/glass slides/microfiche and so on, and C) could be ramped back up from pretty crude tech?
Combine "end of civilization", "time-travelling with tech", and other thought-experiment scenarios for fun.
If it is end of civilization you care about then print out survival instructions. Basic low tech things that one person can do in a farm sized area with minimal help.
Don't worry about advanced stuff. How to make a transistor will be lost long before we develop a new civilization capable of needing transistors. So don't waste paper on it.
I bet I could put movies and media on a BluRay using whatever format was the most stable over the past ten years and be sure that it would be readable by somebody without too much trouble 20 years from now. After that, however, planned obsolescence gets you.
I guess this is the first time I've faced the fact that unless it is printed out at archival quality in some manner that humans can read, it basically won't exist 200 years from now. The data will perhaps be around in some kind of conglomerated and bowdlerized format, but the "you" part of the data will be lost in obscurity, and that's assuming that some version of the cloud stays intact that long.
Thanks. I was looking for something that would work without me being around or there being any kind of script or computer for somebody to maintain. Something akin to a family picture album, only with digital media
I suppose you could always laser etch your bits on gold film and store it in a secure container. There are some products out there like the M disk that claim to last up to 1,000 years, but typical consumer storage media are only expected to last a few decades under optimal conditions.
We had a conversation about this on Ars Technica and I can't currently find it. We ran into the same findings. I don't think you can fully set and forget it... three copies, two formats, one off-site.
I wouldn't count on 'put it in the cloud' being the answer, for a number of reasons.
Personally, I think that the only answer to "preserve the family stuff for 100 years" is to migrate it all to the most reliable media available every 5-10 years (it used to be every 3-5 years). Even if there were an archival medium that was 100% reliable 100 years from now, will there be something that can read it?
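Whichever medium wins a given decade, the migrate-every-few-years plan only works if each hop can be verified — otherwise you faithfully copy bit rot forward. A minimal checksum-manifest sketch in Python (file layout is hypothetical; hashlib, json, and pathlib are all stdlib):

```python
import hashlib
import json
import pathlib


def sha256_of(path: pathlib.Path) -> str:
    """Stream a file through SHA-256 so large videos needn't fit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def write_manifest(root: pathlib.Path, manifest: pathlib.Path) -> None:
    """Record a checksum for every file under root."""
    sums = {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}
    manifest.write_text(json.dumps(sums, indent=2))


def verify_manifest(root: pathlib.Path, manifest: pathlib.Path) -> list[str]:
    """Return the files whose checksum changed or that went missing."""
    sums = json.loads(manifest.read_text())
    return [rel for rel, expected in sums.items()
            if not (root / rel).is_file() or sha256_of(root / rel) != expected]
```

Run `verify_manifest` before and after every migration: a clean run on the old copy plus a clean run on the new copy is what actually tells you the family movies made it across.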
> Emulation is cheap but completely unrepresentative of the original experience.
Never enough, but THANK GOODNESS for emulation!
My favorite old skool computer is the IBM 704 that LISP originally ran on. This machine didn't have a visual display or anything fancy. It was the sort of thing where you'd hand in your punch cards and an operator would run your job, then give you back the program results in your mailbox or something.
When I was doing research for writing my blog posts about LISP about a year ago, it was so helpful that a turnkey SIMH environment for this machine existed, that let me run the original software and then get definite concrete answers about its behavior. There were things I was curious about, particularly its handling of the T atom (and whether that remaps to TRUE which is the true truth) that I wouldn't have been able to talk about, had the emulator not been available.
The situation is only getting worse, as bootloaders are increasingly locked-down and internal hardware pairing is a thing (hello, Apple, but even AMD has joined that game with Epyc processors getting locked to motherboards). And then there's the soldered-in hardware that can't be replaced without a lot of expense and risk. Preservation at this point is going to be VMs and emulators.
And, in all honesty, while a working PDP-11/70 would be kind of neat, by today's standards it's dreadfully underpowered in speed even as it sucks down the kWh, and requires a lot of space, while SIMH on a modern SBC would run circles around it. A museum is a better place for it than a living room.
If you read the LCM tech blog it is exactly in this 'manufacturing replacement parts' where they absolutely shone.
Of course you will end up in a ship-of-Theseus situation: how many replacement parts does it take before it is no longer the original? But that's because these are working systems, unlike your average museum where you just get to look at stuff and the cases might as well be empty.
My surviving stuff consists of a Vectrex, an Interact with many tapes (including Microsoft Basic), and an I Robot. I always thought the Interact belongs in a museum but they all seem to have one already or don't care.
Maybe the solution to hardware failure is to keep the hardware, working or not, AND have emulation. But as you say vectors don't emulate well.
There’s this pervasive idea that computers are solid-state machines meant to work forever. But they’re full of moving parts, grease, fluids, and fans.
Only now with our fanless machines are we approaching the true idea that these machines just turn on and work. I’d bet an Apple TV has a better chance of working in 400 years than a Commodore 64.
There's also a pervasive idea solid-state means works forever. Ask anyone who collects old computers and you'll find that ain't true.
Solid state devices fail with age too, via electromigration, temperature cycling fatigue, hot carrier injection, NBTI/PBTI and a host of other causes. And it gets worse the smaller the geometries we are working with. It's huge topic of concern in the industry.
We won't be around to know, but I'd be confident enough to bet you a beer that Apple TV will be dead as a doornail in years or decades, not centuries.
Yeah. I've got quite a few old machines from various periods, but I mostly just store them or have them on static display. I'm afraid to turn most of them on. The last one I tried was a Mac IIcx and it let the smoke out pretty quickly.
If anyone nearby is reading this, please consider coming by sometime :)
(You need a reservation. Not everything works; in fact, most machines probably don't. It's not super fancy like the Living Computer Museum was. There's a lot of stuff on the floor. It'll probably move to a bigger place soon, so that'll get better. It's not particularly cheap; there's a yearly fee (no auto-renewals) and a per-visit fee. And there's probably almost nothing to do if you aren't fluent in Japanese.)
So far I fixed: ZX81, VIC-20, PET 2001, various MSX machines. (See my blog for details :p)
While we're plugging computer museums, I'll add the Media Archaeology Lab in Boulder, on the campus of the University of Colorado. It's no doubt a far smaller operation than most of the others mentioned here, but it has a lot of heart. Visitors are encouraged to hang out, play around with all the old computers and gadgets, peruse the floppy disk collection, and maybe find some games to play.
I looked up the address, and just want to clarify for anyone who may want to visit: the museum is in Oume City, about 1.5 hours away by train from Tokyo Station. While technically part of the Tokyo metro area, I think for most foreign tourists "Tokyo" refers only to the 23 special wards[0]. So just beware it's not a short hop away after you're done sightseeing in e.g. Asakusa.
It's pretty normal for the airport of a major city to be well outside of those major cities, given how air travel and tall buildings and large swaths of empty land for runways and population density don't exactly play well together. It's not usually that big a deal - for one thing, because you're likely to only do one leg of that journey on any given day; for another, because there's usually lots of good, tourist-friendly transit options to get from the airport to downtown.
So tourists are accustomed to this sort of thing, whereas - as pointed out above - they're not exactly accustomed to going that far out of the city while sightseeing "in the city".
I'm planning my next trip to Tokyo and got real excited reading about a Japanese computer museum (I'd love to get to play with some vintage Japanese micros, seeing how different they were from Western machines) but I don't know if I'd be willing to take three hours out of my vacation to go out to the suburbs and back...
I’ll be in Tokyo early next month with work, and this sounds like something I’d enjoy. Is it worth going as someone who doesn’t speak much Japanese? If so, I’ll get myself there :)
Hmm, well, I think it wouldn't be easy to make a reservation without being able to read and write Japanese.
Actually TBH I don't know if the owner can speak English :c Normally, first-time visitors are given a quick tour so they can decide whether to become a supporting member or not. I'm not sure how the owner would do that. (Maybe he would ask me to translate? I'd probably do it :p)
I can write/speak enough to make a booking for myself, but if there would be a lot of complex written stuff like histories and things, maybe it's not for me. Thanks for letting me know :)
Just added it to my list. Might be a while - I live in Ireland - but who knows… a couple years back I was prospected to join Toyota’s “city of the future” project…
Sadly it was really on the down swing after the 2019 firings. I had been going to the museum for years ever since it opened. In 2017 and 2018 they started experimenting with live events and parties, like a Blade Runner 2049 release party with actors dressed up as blade runner characters, a full bar making futuristic mixed drinks, etc.--it was awesome. There was a tech themed burlesque show with performers covered in LEDs and such. They did other fun stuff like a retro 80s party night to celebrate opening a new 70s/80s themed wing of the museum. They would host local events like the retro game meetup, mechanical keyboard meetup, etc. It was really growing and turning into a fun geeky social space.
And then in 2019 Vulcan fired everyone responsible for making it great. No more after hour parties/events. Just as things were starting to take off they ruined it, and then the pandemic hit and it was done for good.
It's a real shame and I hope someone who appreciates the museum can take control of it and rescue it from Vulcan.
Does Jody Allen secretly hate Paul? Given the literal billions they have, it seems pretty spiteful how the LCM and Cinerama were treated.
I visited the LCM in 2016 as part of a computer architecture course. It was really quite special to see working systems from over 50 years ago and get to use them.
> Does Jody Allen secretly hate Paul? Given the literal billions they have, it seems pretty spiteful how the LCM and Cinerama were treated.
She probably has her own priorities, and is shifting things to match them.
IIRC, this phenomenon is somewhat well known in philanthropy: if you set up a long-lived foundation, it will end up reflecting the priorities of the future staff, even if those bear little resemblance (or are even opposed) to those of the founder. Everyone wants to do what they want, not what some dead guy wanted, even if all the money was actually his.
I believe this has caused some people to give up on the perpetual foundation idea, and structure their fortunes to get spent on their goals by a certain deadline.
It probably would have been better for the LCM if Paul Allen had set it up as some autonomous endowed thing controlled by a board of hard-core retro-computing enthusiasts.
It was well known that she had her own priorities before Paul's death. It's unfortunate he ignored that.
> Former members of the security detail... have said in sworn depositions that Vulcan CEO Jody Allen sexually harassed bodyguards while also directing them to smuggle animal bones out of Africa and Antarctica. At least two former employees said they heard Jody Allen had smuggled ivory out of Africa in violation of U.S. and international law...
For anyone curious about this phenomenon, the Ford Foundation is a notable example. Henry Ford was notably racist and the Ford Foundation does a lot that is focused on racial diversity.
Isn't that exactly the same phenomenon but with a positive outcome? Ford set up a long-lived foundation, and it ended up reflecting the priorities of the future staff and bore little resemblance (or was even opposed) to those of the founder.
Isn't that to be expected? It was offered as an example of the phenomenon.
> ...but with a positive outcome?
There are people who think shuttering the Living Computer Museum is a positive outcome. I'd wager the vast majority of Americans would choose to spend that money on something besides keeping obsolete computers running.
I doubt she hates her late brother, any more than any widow who sells off or throws away her late husband's computer collection after his death (which occurs so, so often that it is probably the normal outcome after the collector's death, as opposed to an exception) hates him. It's disinterest.
Most late husbands don't leave an endowment big enough to cover a potentially perpetual upkeep for their collections either, but Allen wasn't most people, and he personally was significant in the history of computers obviously. Normally, I'm not a fan of these extremely wealthy people setting up dead hand organizations like charitable foundations to affect society after their death, but museums that preserve aspects of life from their times and information about their lives seem like a pretty good use case.
I have to imagine she doesn’t care, and 20 employees even is a chunk of change if you have some advisors looking at maximizing the dollars in whatever trust.
Of course the fact that Paul didn’t explicitly set something in stone for like 50 years despite the billions… really sucks, cuz that place was awesome
The contents of the will aren’t public but there’s been uncontested reporting that it calls for everything he owned to be liquidated.
“Paul directed that the trust be liquidated upon his death and the assets used to fund his passion projects,” a source said. “None of this is up in the air. The instructions are clear: The sports franchises and everything in the trust must be sold.”
Maintaining these projects ain't cheap even for billionaires (by that I don't mean it would make them broke, or even close, but spending hundreds of thousands or even millions on something you have no interest in is gonna sound expensive).
They need to be financially self-sufficient to even have a chance, IMHO. Even then it isn't a given. Just like people sometimes clean their houses even when they still have tons of space.
Seattle could afford that easy cheesy. Really this is down to the lack of a strong CEO (probably because of the way this was incubated inside of Vulcan). A strong CEO could probably throw enough benefits and special events to run a 2-3MM operation and possibly get a proper endowment from some of the very wealthy people who live within the Puget Sound region. Oh, and that money would be tax deductible and you would get invited to some great networking events.
That CEO could also pay themselves $250k without blowback either.
Unfortunately it wasn't set up like that when Paul was alive (more like a personal collection). And after his death, the chance it became like that is even slimmer.
Allen had a net worth of something like $20 billion. Even at $4 million a year, that’s enough to fund the museum for the next 5,000 years. Maintaining a project like this is NOT expensive.
It's not about how much money he (or his widow) has, it's about no one likes to randomly pour money into something they have zero interest in, and we're talking about money perhaps in millions per year. And the administrative headache.
Exactly. Money isn't enough to keep things running in perpetuity or even for a long time -- ultimately everything has to be administered by an organization that has a self-interest in sustaining itself with a board etc.
If money is simply set aside to fund something, somebody still has to hire somebody to manage the thing, and oversee their performance, and fire the manager if they underperform, and find a new manager, and sign off on big decisions like a change of space, and so forth. But if that trustee just... doesn't care to do that... then what exactly are you going to do? Somebody would have to sue to enforce that overseeing, but frequently nobody cares to sue or has standing.
Seems like the bills are still being paid, though. The museum is still physically there, and someone (Vulcan) is keeping the vintage computers maintained and operational.
Implement ticket sales, events, fundraising? Get a donation from Gates? Buy the land or move somewhere cheaper. Invest in some solar panels and skylights?
Also really sad about LCM closing and just generally about the way it was handled by Vulcan etc. For computer nerds that place is heaven. There is so much to do, and its wild seeing systems you may have owned at one point - and getting to actually use them again. When I went they had a sort of 70s/80s themed basement exhibit and "highschool computer programming" room, again with almost everything functional. You could even make your own punch cards. I'm on the east coast but would fly back out to Seattle immediately if given a chance to visit again.
For people wringing their hands about profitability: I'm sure it could have been made self-sustaining, or hell, they could have at least attempted to raise some money outside of the foundation; a lot of people loved that place. It really seemed like the powers that be just didn't care.
Unfortunately it’s inevitable that LCM is going away. For folks outside Seattle who may be unaware, Allen’s military museum was closed and exhibits sold off last year, and his Cinerama facility looks to be heading the same way.
It’s hard to believe this was the estate plan he intended. A lot of local people are pointing the finger at his sister who appears intent on erasing his legacy outside pro sports.
All of those are significant losses, too. I don’t care as much about the military museum and there’s an ongoing argument about the ethics of flying some of those old planes (wear and tear is bad), as Allen did, but the collection still mattered.
I think the only Allen project besides sports that’s going to survive is MOPOP and that’s very tourist-friendly in an excellent location.
Unfortunately it appears to be the case that this is what Allen wanted. A connected sports journalist in Portland wrote about this last year (see below link).
> “Paul directed that the trust be liquidated upon his death and the assets used to fund his passion projects,” a source said. “None of this is up in the air. The instructions are clear: The sports franchises and everything in the trust must be sold.”
From someone two degrees of separation away from the situation, this really was a core issue. A lot of bickering from family members and outside interests over what's funded and how much. (Yes, all second-hand random internet rumors, hearsay, take it with a grain of salt, etc.)
It's sad because the LCM and Cinerama had such an impact on Seattle culture.
Business Insider says brain research was one of them. I imagine if I was a competent financial reporter I could dig into where the money’s going and maybe learn something but I don’t have those skills.
Allen Institute for Brain Science and AI2 were never going to lose funding. Waaaayyyy too many people employed and involved with them with a fancy South Lake Union building they built. I'd love to see financials from them because they pull in a lot of research money and act as a tech incubator/investor. They're probably pulling significant revenue from those ventures.
I was lucky enough to visit the museum at the end of 2019, just before COVID-19 hit and the museum had to shut down. Unlike the Computer History Museum in Mountain View, the machines here were fully operational and you could actually use them - I compiled some hello world code on a NeXTcube and I still remember that fond moment. IIRC they had systems that you could SSH into and use, but it seems like they've taken that down. Must be a recent thing, since I remember they were operational a year or two ago.
The Mountain View museum is an expensive travesty. Another great "living" museum is the one next to Bletchley Park. You can program the Dekatron with paper tape and watch the base-10 memory cells changing as it runs.
The one next to Bletchley Park is The National Museum of Computing. It's a really great place and, IMHO, much more interesting than Bletchley Park itself.
Per the article, the parent company laid off all of the museum staff in mid-2020, so if they were working a year or two ago, they were doing so despite a lack of maintenance. Something probably broke at some point and there's no one left to fix it.
I managed to compile and run a hello world C program (it might've actually been B?) using ed on an old school Research Unix machine they had.
I also remember messing with the shell on an even older Unix machine that had an actual teletype printer terminal. And directly poking the screen memory on a C64 to do goofy things...
I only went once a while ago, when my family took a trip to Seattle, but that is one of my most treasured memories. It's a travesty that future generations of nerdy kids won't get that experience (and I'm sad that I likely won't get it again either).
It was the first and only time me and my brother saw a mainframe actually running. And hundreds of other early computers. Incredible place and such a shame.
I have been thinking recently about how you could design governance for a high-end nursing home so that the facility perpetually serves the interests of its rotating residents. Without governance, such a facility would quickly trend toward entropy. The first step of entropy is pillage by the executive.
When I look around, adversarial systems (courts, parliaments) seem to be a device for building bedrock for better civilisation. I don’t think charity is, due to the governance obstacles.
Consistently finding and hiring the right people over time is just too difficult to be truly sustainable over multiple generations. A project will always be a few bad hires away from collapse.
Maybe a sufficiently advanced AI serving as the "right person" in a governance role is the long-term sustainable solution to this, once that is a capability. Sounds crazy now, but probably less so in 10-20 years.
Agreed. After I visited, I decided that that was the best way to preserve the fun of old computers and finally recycled my old Gateway 2000 tower from 1995 since they had like dozens of them there. Now I feel that I should have kept it.
As others have noted, this is an expensive undertaking, and unlikely to be self-funding, but also not that expensive in the context of the profits our industry generates and the numerous billionaires created therein.
I'm also interested in old cars. Many old cars are worth millions and people like Jerry Seinfeld and Jay Leno use their wealth to preserve them.
Unfortunately that scenario doesn't seem to exist for historic computers yet, although there is a collector scene at the low end (try to buy a KIM-1 now for less than $1000, for example).
I'd be willing to pay something in support of reopening the museum -- I already spend some money on picking up what I see as historic artifacts from auction sites and the LCM would actually be a more efficient way for my money to be used to achieve preservation of larger working systems. For example I have a complete board set for a pdp-11/70 (key milestone in the evolution of the systems we use today), but I can't afford to buy a rack, disk drives, and provide electricity to make a running system. The LCM already has a pdp-11/70 running V7. If I could fractionally own that, I think it'd be a good deal. I could also pledge my board set as spares should they be required.
But it would take 10000 people with my limited resources to properly fund the museum, I suspect.
This was a big mistake on his part -- assuming he wished for the museum to continue. He had a number of months after the diagnosis to make these decisions so they wouldn't be left to others.
I was fortunate enough to go in 2019, pre-covid, while I had a day off in Seattle. It was a great museum, and I’m disappointed that I won’t be able to go back.
I'm saddened to learn that it is no longer open to the public, but I wanted to say thank you for sharing these photos. It makes me want to learn more about our industry's history.
What a truly handsome set of machines and I hope they find their way to another venue or become accessible to the public once more.
I loved this place, I spent a whole day here once just whimsically coding on all of the boxes that would let me. It felt like all of the stories I had read about the older days of computing truly came to life in front of me. Hell, I forgot the model but there was one that had an older typewriter-like interface that could play chess, I beat it and the museum staff took my picture! It was a magical place, truly sad I can’t visit it anymore.
Hi malwrar, I'm Hunter Irving, a former employee of LCM+L. I was the one who started the practice of taking photos of chess winners. I don't have them all, but some of the Hall of Fame photos (up to April 2018) are preserved here: https://hunterirving.com/blog/2017/the_hall_of_fame/hof/# Maybe you'll see yourself in there! The computer you were playing against was either a DEC PDP-8 or a PDP-8e. The program was CHECKMO-II ( https://www.chessprogramming.org/CHEKMO-II ).
Keeping old equipment going takes much time, effort, and obscure knowledge. I spent years restoring old Teletype machines. (If anyone went to the Clockwork Alchemy steampunk conventions and saw the Telegraph Office [1], that was my doing.) Fortunately, a great many of those Teletypes were built, they're very well documented, they're made of good steel, and there are still parts available. It's easier to restore the all-mechanical machines of the 1920s and 1930s than the more electronic machines of the 1970s. Teletype built the first ink-jet printer, the Inktronic, and as far as I know, no working examples remain.
The Computer Museum in Mountain View has a working IBM 1401, and it's kept going by old guys who worked on the 1401 in the past. Soon, they will be gone. They have conflicts with the museum staff, who just want stuff that sits there.
The Union Pacific Railroad restored the Big Boy steam locomotive to full road operating condition a few years ago. That was a huge multi-million dollar effort. They now run it long distances for tours. Union Pacific has a Steam Shop and 50-100 employees. They get a PR benefit out of it.
The Museum of Flight in Seattle used to try to maintain some of their aircraft in flyable condition, but they seem to have given up on that.
The Jaquet-Droz automata [1] in Neuchatel, Switzerland, are still maintained, since there's still a mechanical clock and watch industry in Switzerland able to maintain such things and it's a source of national pride.
I loved that place. I spent an entire day there from open til close when I was contemplating changing jobs, and it really helped put my work as a developer into historical perspective. Running Unix commands on a PDP-11 and watching the output slowly scroll up the screen was awesome, and seeing the 17-step process to boot the PDP-7 taped up on sticky notes and then playing chess against it was cool. I always hope some other Seattle billionaire takes over the reins, but at this stage my best hope is that my options are worth something in a few years and I can do it myself.
I volunteer on an event called the Microsoft Intern Game (a weekend long puzzlehunt for MS interns over the summer) and we were fortunate enough to get to use the LCM for one of the locations.
The puzzle (write up here, https://interngame.microsoft.com/events/2013/floppy/overview... make sure to click the solution link for notes about the happenings at LCM) actually used some of the real computers (running a simple DOS program from a floppy disc) as opposed to just emulating it on newer hardware. It was an amazing experience that you would never get at another museum.
My favorite pic had nothing to do with our puzzle at all, but was just a computer control panel with a helpful label that said “knob”.
Interestingly, the Seattle 'Museum of Communications' has remained open, rebranded itself, https://www.telcomhistory.org/connections-museum-seattle/ and joined sites with the 'Connections Museum Denver' in the past couple of years. Also not a high-budget operation.
And, in another signal for computer-collectors, "The Digibarn Collection is on long term loan at the Computer Museum at System Source in Hunt Valley, Maryland."
... https://www.digibarn.com/index.html
To add to the list of still-active computer museums: the Computer Museum of America (CMoA). As of a few years ago they were an integral part of MAGFest's museum exhibit, and it's really awesome to sit down and play games on a 50-year-old machine.
I love this museum and I'm truly shocked to hear this news! I tried to go during the pandemic but I didn't realize that it would be gone forever... I still have a shirt in my closet!
Is there anything that we can do to help keep it alive?
It seems like if the company doesn't want to operate it, then they should transfer it to a non-profit so it can be run by volunteers. Obviously that still requires money, but theoretically less of it. It seems like it would be a no-brainer for some local schools and universities to pitch in for funding.
When I read about the museum closing, it made me quite sad. I was absolutely thrilled when I visited it 7 years ago. I didn’t have time to go through everything I wanted, thinking that next time I’ll make more time for it. Now I realize how even things like computer museums in IT cities backed by IT billionaires are ephemeral.
This reminds me of a conversation I had with an archivist acquaintance I met at a conference. Her opinion was that I should donate my personally accrued private assemblage of retrocomputing game systems and personal computers to the Living Computer Museum, out of the belief that they'd be under better care there than in the hands of a private collector.
While LCM volunteers may have a keen interest in preservation, I don't feel there's the same sense of ownership as with systems literally brought back to life, restored by the hand of the person who purchased them expressly for the purpose of sharing those experiences.
I think you're pretty much hitting the nail on the head there. I don't doubt that there are museums out there which do a good job on the preservation front -- but many I've encountered act as 'black holes' where rare artefacts go to never be seen or heard of again.
I've seen and heard horror stories of rare machines being destroyed because the CMOS or PRAM batteries were left in place while they were stored -- no reasonably clued-up collector is going to make that mistake. But some of these museums tie themselves in "Ship of Theseus" style knots by obsessing over keeping the original, dead batteries at the cost of losing the whole machine.
The best thing that's happened for software preservation is people uploading random disk images to archive.org...!
Always wanted to visit it whenever I was next in Seattle, I guess I will need to wait a while though.
In the UK we have a few smallish computer museums: one at Manchester University, one at a London university, and then we have the National Museum of Computing at Bletchley Park (not to be confused with the Bletchley Park museum, as there is some politics).
In the case of the NMoC it is especially important that it be preserved, since it has a working Colossus machine and many other historic British computing achievements.
I got to go once in 2017 and really hoped to be able to go again someday... If it is dead, I at least hope their machines are donated to a similar museum elsewhere
If you visit https://wiki.livingcomputers.org/ you can learn about many of the remote systems that continue to operate. They are a mix of 18 real-hardware and emulated systems, with guest access available if you don't have a personal account. The wiki is also contributed to by the community of users of these systems. In it you'll find code examples and user documentation for these historic machines.
Years ago I travelled to Paris, partly to visit another retro computing museum: the Musee de l'Informatique in the Grand Arch at La Défense. I’d heard great things so it was a huge disappointment to arrive and find it had “temporarily” closed a few days earlier due to problems with the building’s elevators.
Just yesterday I was remembering visiting this museum, and I decided to start making plans to go back. I didn’t realize it had shut down. That’s very sad news…
I'm surprised Bill Gates hasn't stepped in and taken over funding of Allen's projects. I know they weren't the best of friends when he left Microsoft, but I figured Gates would see the value of these museums in his advancing years, especially the LCM, which features mostly Bill's stuff!
I'm not much more disappointed; perhaps equally so. Both are cultural icons. I saw 2001: A Space Odyssey and all three of the Star Wars prequel films at the Cinerama. But I also spent a few very memorable days at the LCM.
I was lucky enough to go there once in 2019 and couldn't wait to take my kids there as they got older. What an incredible place, one of the very best museums I've ever been to.
It's a real shame if it does not re-open in some form or fashion.
Hmm, this is sad. Most big science/tech museums in major US cities are disappointing, but this was a really good one. I went in 2008 or 2009 I believe.
What I find weird is that this is (basically) in San Francisco, which is where Silicon Valley and related places got their start.
They have SO MANY rich tech moguls (and mogulettes) that it's hard to imagine it isn't getting incredible funding -- even from the city, or Palo Alto for example.
What is the real problem? If I were a rich millionaire/billionaire from SV I would donate in a heartbeat...
As someone who lives within biking distance of this museum and has visited it on multiple occasions, I can assure you this is very much in Seattle and not in Silicon Valley.
Ah, I hallucinated that it was in Oakland. My apologies, you're totally right. Mind you -- to extend my mistake -- they might have a better chance if they moved to the Bay Area. Not to minimize in any way the difficulties they are experiencing.
A Dutch non-profit, staffed by volunteers and people in need of practical training.
Visitors to the museum get to touch and interact with most of the machines. I've 'adopted' a few machines there that introduced me to computers and programming as a kid (the language-switch feature on their site is a little glitchy; Google Translate is your friend). If you're in the neighborhood, it's definitely worth a visit.
Their LinkedIn page has a proper introduction in English:
"The HomeComputerMuseum is a museum about (home)computers based in the centre of the Benelux (Belgium, the Netherlands and Luxembourg), in the technical region of the Netherlands, close to airports like Eindhoven and Dusseldorf and easy to reach by train and car. The museum is 500m2 and has around 400 computers in stock, from the early Commodore PET2001, Altair 8800, Tandy TRS80 and Apple II all the way up to early-2000s PCs with Windows XP and Mac OS X 10.5. Since computers will break when they're not used, all computers are kept on during opening hours and ready to be touched by anyone who wants to. Even printers, modems, scanners and other additional hardware are connected and ready to use. We also have our own repair lab for when something breaks. All computers are set up in an environment styled after the correct era. It's not a museum, it's an experience!
At least once per 3 months we do a big LAN-party where the newest computer is a Pentium 4. Meaning we do only old school gaming with games like Duke Nukem 3D, Red Alert, Wacky Wheels, Unreal Tournament and anything we fancy to do.
We are a non-profit organization that gives chances to people with autism and teaches young children how the old computers work. We strongly believe that in order to understand the future, one must know the past. That's what we're doing.
See, feel, try, play & learn!
We also love to show these old, sometimes forgotten, computers on Youtube and on social media. We need funds to keep this museum up and running and that's where you can help!
We're based in Helmond, conveniently placed between big airports (Eindhoven and Dusseldorf), easily reachable by train, with plenty of parking, and right next to the city centre of Helmond. So make sure to visit us and consider a donation or any other form of support."
Just a small reminder that everything you work for and thought important, everything you poured your soul and life into and consider your legacy, will die at some point after you. Sometimes only a few months after, when some random bean counter expressly breaks your wishes for some self-interested reason that you are no longer alive to dismiss as moronic.
Very, very few people can leave a lasting mark and especially change the world for the better.
Don't know about you, but when I go use an old computer/videogame/candy/snack that I enjoyed as a kid, it usually does not evoke the same feelings it did when I was first experiencing it. What I'm getting at is that the effort was not wasted, because for a moment in time it evoked an incalculable amount of wonder, joy, and happiness in many people. The moment was bound to pass for each person regardless of the status of the museum, because it was never just the museum; it was everything that was unique to each person at that point in time. The museum just happened to add a positive contribution to each person's moment.
I'm not quite sure what your point is, but don't you think that there's enormous cultural value in places like the LCM that allow more people to experience that feeling of wonder for the first time? Just because that feeling wanes away after the first experience, doesn't mean that these places aren't very valuable to society.
I guess I didn't explain it well, sorry. I agree that the LCM is very valuable to society; I think I accidentally conflated two separate points. My first point is that yes, the memories made are a moment in time that the mind cannot articulate correctly in the present (hence nostalgia evoking better feelings than what reality was at the time), BUT, and this is important, even if the LCM goes away, the effort it spent was not in vain, because it created those moments for so many people.
OP stated that everything you do will eventually die at some point after you. I disagree, because the positive impact the LCM created in someone can be passed on to their friends and family, and that positive impact can keep cascading downward.
I had this same exact discussion with some guy manning the Atari 800 booth at a recent Vintage Computer Festival East event. He was showing off cool demos running on the computer and we talked about how after he dies he felt that no one will care about this computer equipment anymore. The love for this platform dies with him and his generation.
It got me thinking that this item was made for a certain era which consisted of certain cultural elements that existed not only in the country where the item was designed, but also whatever thinking that was prevalent in specific components like the CPU design, computer science advancement at the time, marketing, etc. In other words, the computer was made specifically for people living at that moment in time. There probably wasn't much thought put into how different future people would perceive it because you don't know what the future holds and at the point of sale those people don't exist and don't matter.
At the same time, though, the memories the computer generated will not be lost forever. They could have impacted downstream events (e.g. a parent whose love of computing was sparked by that machine goes into tech, and their children are motivated to follow).
Well. It sounds like the museum was not financially self-sustaining (which is fine) and the founder did not leave a well endowed foundation to maintain it.
Indeed, the more one achieves, the more one contributes. One might even suggest it is the defining characteristic of life - a better way to increase entropy.
I dunno, those forests were going to burn eventually anyways-- so is that a contribution?
To really make your mark on entropy you have to bring things to a lower energy state that requires crossing a substantial barrier, such that it wouldn't happen on its own.
No, you want a stable ecosystem that can keep alive as much life as possible which can maximise entropy production over much longer timescales. It's necessary to maximise the whole system's entropy exhaust over time. Much better if we can get people to colonize new planets, or create a Dyson sphere or whatever.
While the things we build will inevitably end at some point, the ideas they inspire in others will live on through their creations and so on. Exploring and executing on ideas is perhaps the biggest impact any of us can have on future generations.