Guess the old "total cost of ownership" anti-Linux FUD slides from Microsoft from back in the day need an update.
It's been a while since I read that stuff, but I doubt they included "expensive paid-for security fixes after product EOL" on their side.
Pretty bizarre situation though. Why did the Navy not migrate? It's not like the EOL of WinXP was the Spanish Inquisition.
I'm speculating here, but from my experience with Windows-powered oscilloscopes: Windows is on the measurement device itself; it's not a networked or USB-controlled peripheral with a documented network protocol connected to a standalone PC. When you open up "Device Manager" you'll see a bunch of specialized USB and PCI/PCIe peripherals that make up the actual measurement function and user interface.
So the company selling the spectrum analyzer would have to publish quite a lot of their internal documentation regarding hardware registers for the data acquisition boards, and they'll be reluctant to do this: much of a modern measurement instrument's functionality lives in the data processing, and documenting the interfaces would make it possible/easier to reverse-engineer or extend (without paying for options) that functionality.
If the buyers were to stick together and only buy from suppliers who allow access to modify and redistribute the source code of these devices, perhaps they could create incentives for the devices to function with newer software.
Why would the Navy pay for support if the computers on XP were air-gapped? The only reasonable conclusion is that these computers are on a network accessible from outside.
Consider that the US was able to attack air-gapped computers in Iran and destroy industrial equipment. Air gaps can be breached, it's just a little harder.
I doubt they included it on either side. How many Linux vendors still offer security updates for distributions from 2001, at any price?
As others have pointed out, paying for support is likely much cheaper and less disruptive than developing, re-testing and re-verifying new versions of their systems. "If it ain't broke, don't fix it." I'd imagine when they do eventually migrate they'll try to stay on the new systems for as long as possible, as well.
I almost fully agree with you. But... looking at the broader spectrum (Unix+Linux), I was recently tasked with migrating some sort of legacy app suite from AIX to Linux. Part of this was a set of Bash+Perl scripts for various document modifications for printing purposes. What a clusterfk those scripts were...
I am (still) no expert, and a Java developer on top of that, so working with these in vim/UltraEdit was a bit nightmarish compared to what even basic Eclipse can offer you for debugging in Java. But the worst issues were bash commands whose syntax differed from AIX: for example, some ignored certain parameters, or some params had a different meaning... Bear in mind I was not in a position to pick which version of bash, perl etc. got installed where.
Any idea what the WinXP machines are being used for? If it's a key part of running a ship, changing it may cause a flow-on effect to other parts of the ship, kinda like a refit. However, I guess there is more to their operation than just running ships.
I believe the phrase "if it isn't broken, don't fix it" is at play here.
XP was almighty popular for any sort of scientific instrumentation, in no small part because of its good compatibility with Windows 9x (!). Manufacturers of advanced equipment tend to worry about small things like Proper Science rather than OS versions and other hipster stuff, and they're often small companies, so they historically tend to go Minimum Effort Required when it comes to interfacing with computers. They're also very worried about performance, so they have to go low-level, increasing the likelihood of incompatibilities with the latest and greatest and reducing maintainability in general.
A lot of drivers were developed back in the '90s and just tweaked for compatibility when absolutely necessary. Long-term compatibility is also why that world tends to favour Microsoft, that's one thing Redmond really cares about.
Libre software and the associated freedoms aren't (or shouldn't be, at least!) "hipster stuff"! They are also in no way in conflict with Proper Science. One might almost argue that with regards to reproducibility of results, Proper Science demands a certain amount of Freedom.
If it's Libre (which I suspect RH isn't, but I am no expert) then all the source code should be available, and presumably you could hack the drivers (or whatever) to work with whatever new-and-improved hardware/software you want to run. That's kind of the point of Libre.
read my post again: most manufacturers have no time for this sort of diatribe, and most users have no time or inclination to "hack the drivers" -- especially in rigid "efficiency-first" organisations like the military.
If the choice is between a) throwing away a $300,000 piece of equipment for lack of drivers, assuming the company producing it has disappeared without a trace,
or
b) spending less than $300,000 on employee time to make it work again
then to me case b sounds preferable, even in efficiency-first organisations. The military, I don't know. All I'm saying is that b wouldn't even be feasible without the stuff being Libre, so diatribe or not, it actually would benefit public, private and commercial users' interests.
Because it's likely that a lot of their equipment was actually developed for Win95/98/2000 or even DOS.
> I've heard security called many things, but I think this is the first time I've heard of it referred to as 'hipster stuff'.
It was tongue-in-cheek, but what I meant is that most people don't upgrade their OS for fun (or for security) -- they do it only when forced by external pressure. Dunno about you, but I'd rather have nuclear submarines running on well-known and (literally) battle-tested software, screw smooth animations and glassy windows.
If you don’t interface with external systems, and your computer has no IO ports open to its users, well, then it doesn't matter. You could run DOS 6.2 and it would work just as well, security is not an issue there.
Usually it's important tasks that are difficult and expensive to retrofit.
I had a customer who to this day has a distributed system running Windows NT4 on Alpha for a critical business system. It was cheaper to set up a dedicated network than to deal with the application. IIRC, they now have a five-year project to replace it that is just starting.
We have a few machines that are controlled by software that only runs on NT4. They've been air-gapped and run on their own domain for security reasons, but we can't get rid of them without spending millions on new machines (this is a high-tech manufacturing environment). The vendor of the current machines no longer exists and in those days we never thought to include software escrow in contract negotiations ... not that we'd have the gumption to attempt updating industrial machine controllers ourselves anyway, but still.
Reading between the lines of the OA: applications for specialised hardware. That means getting the suppliers of the original hardware to rewrite the applications to work with later versions of the OS, and that might take time.
I can confirm there are plenty of businesses running various systems on XP. It can get even worse: at work we STILL use XP 32-bit as primary workstations (at least it's Linux on all servers). I am a Java developer, so routinely I have over 3.5 GB of memory in use on a system that can address roughly 3.2 GB max. And we're talking about virtualized remote machines, not real desktops (yes, it's crap). At least at the end of the year, Win7 64-bit is coming.
The main reason might not be XP so much as that plague called IE6. A couple of important intranet apps run only on it. Migration is underway, but apparently this isn't such a priority for our management.
What backwardish third-world company do I work for, you ask? Well, one not-really-tiny private bank in Switzerland...
That's hilarious and just what I would expect from a bank in Switzerland. I am a Swiss programmer myself and, seriously... that's one of the reasons why I stopped working for certain Swiss companies.
Three years ago, I worked for a quite popular hosting company. I coded a few features for a web app and had many restrictions, because people at Credit Suisse still had to use Internet Explorer 5. Well, at least I was told so...
That's interesting to hear about Switzerland, but it's actually banks in general. Until recently I had to support IE8 on various financial services websites for the company I worked for, not because of any actual customers but because our partners at various banks in Australia are often still using XP, and even those that have migrated to Win7 are still limited to IE8.
My friend works for a large US bank and told me they are generally limited to IE8 but he had "bribed" the IT department to give him an exceptional upgrade with some fine beer.
Switzerland is notorious for being a haven for tax evasion, big financial criminals, etc. Why should they be quick to support something new if what they have isn't broken? It's not like they're offering SW-related services. Not at all.
maybe because support is gone unless you pay a hefty sum, which might hike up in the near future or be discontinued altogether? all software is broken in numerous ways; super-complex things like an OS even more so
did they consider running the beloved IE6 in a virtual machine instance? My uneducated guess is that it would be cheaper, even counting the expense of training users to start the VM.
This sounds like a great use of taxpayer money. $30M is chump change compared to the upgrade cost, and as I've grown older I've developed a reverence for working code and working systems.
> $30M is chump change compared to the upgrade cost
Said upgrade is going to happen anyway, so it's not staying fees vs. upgrade cost, it's staying fees (for however long it's dragged out) plus upgrade cost.
The proper comparison is $30M vs the cost of not doing "business".
(Notice the floppy disks. They might actually be safer than USB drives, since the latter introduce considerably more attack surface, whereas a floppy is an extremely dumb storage device.)
In any case, from a risk-management perspective, I believe that software tends to get more stable over time if the only things being done are bug fixes; it's the radical rewrites and added features that come with new versions that bring more bugs. If it works, why "rock the boat" with new unknowns? There is more to lose than gain in this situation. I wouldn't be surprised if almost all of the important bugs in XP have already been found and fixed, and the limitations identified. It's like an asymptotic curve.
Does anyone know what they get for $9M/year? That sounds awfully cheap. A few engineers to port and install security patches? I guess there must be a few businesses out there paying, so what is the total money invested in maintaining Windows XP?
I guess that one problem with keeping Windows XP alive is that with fewer people using it over time, the chances of discovering flaws which need patching goes down. But maybe these guys don't care about that because all their stuff is offline with epoxied IO ports, yes? /s
The warship has been decommissioned, but clearly Windows is more durable. They also clearly managed at least one upgrade in the past, from NT to XP. Maybe they're having trouble with UAC.
That warship was NOT disabled due to NT issues. It would have been exactly as disabled if they had used Linux, or Solaris, or OS/2, or any other modern operating system.
They were using a client/server architecture, where the clients were essentially smart terminals for data entry and display. The failure happened when someone entered a 0 in a field that was not supposed to ever be 0. The terminals did not error check that field and reject bad values, and the server did not error check its input (probably it was written under the assumption that the terminals did the validation). The result was that their server application divided by 0.
The application did not trap divide-by-zero exceptions, and so NT did exactly what nearly every other modern OS, including nearly all Unix and Unix-like operating systems, does when an application does not trap this kind of exception: it terminated the process.
The application developers had not made provisions to automatically restart the application if it failed, and the terminals couldn't do anything with the server application down, and so the ship was dead.
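The failure chain described above (no validation at the terminals, no exception trap in the server, no auto-restart) can be sketched in a few lines. This is purely illustrative Python, not the language or code of the actual system, and the function names are invented:

```python
def process_record(divisor: float) -> float:
    # Server-side computation written under the assumption that the
    # terminals never send a 0 in this field.
    return 1000.0 / divisor


def handle_record_unsafely(divisor: float) -> float:
    # What the Yorktown software reportedly did: no validation and no
    # exception handler. A single 0 raises ZeroDivisionError, and with
    # no handler anywhere up the stack the OS terminates the process --
    # on NT and on Unix alike.
    return process_record(divisor)


def handle_record_safely(divisor: float):
    # Validate at the boundary (what the terminals should have done)...
    if divisor == 0:
        return None  # reject the bad value instead of computing with it
    # ...and still trap the exception defensively (what the server
    # should have done), so one bad record can't take the service down.
    try:
        return process_record(divisor)
    except ZeroDivisionError:
        return None
```

With the safe path, the bad record is simply skipped (`handle_record_safely(0)` returns `None`, `handle_record_safely(4.0)` returns `250.0`); the remaining gap, auto-restarting the server if it dies anyway, sits outside the application in a supervisor or service manager.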
This article is incredibly biased against Windows.
> when the software attempted to divide by zero, a buffer overrun occurred
While it's possible some poor exception handling led to a buffer overrun, it sounds dubious. Your explanation sounds more likely. Do you have any references?
The various random quotes regarding Windows NT's fit for purpose are highly opinionated. The article doesn't mention that at the time Windows NT was certified at the NCSC's C2 rating level; while I'm just guessing, it seems entirely reasonable to select Windows NT because it was the only C2 certified OS with a GUI, which may have simplified development and systems integration given that some of the applications required user input.
The grandparent comment mentions it was client/server, which probably means a network. AFAIK, Windows NT was certified as C2 only without a network, see for instance http://csc.columbusstate.edu/summers/NOTES/CS459/NT-C2.htm ("Windows NT's C2 certification was conducted on a stand-alone computer. Hence the computer needs to be disconnected from the network by uninstalling all network hardware and software on the system.")
I don't have any specific reference that covers all of it. There were a lot of different stories reported, so it was more of a synthesis of all of them, with some filtering based on the titles of the people quoted and on how they phrased things (whether they were technical people or management). There are also a lot of details differing in the various articles. For instance, someone said the ship was towed to port and took a couple of days to fix. Later reports said it was simply stopped for a couple of hours while they fixed it at sea. Then there was a dispute between the story that reported the towing claim and the person they quoted for it, with the latter saying he was misquoted and never claimed it was towed, and the magazine insisting they had quoted him accurately.
It's possible that I've misdiagnosed it and the exception was in fact caught. That is also consistent overall with the reports: rather than being terminated by the OS, the process ignored the divide by 0 and ended up using some invalid result, leading to the application failing.
MSFT still sells security updates for Windows 2000. It would not surprise me if you peeled enough layers in a large organization and found NT5/4 machines still there, machines that require the sacrifice of 3 goats and a virgin every full moon to continue running, and that might still get updates from MSFT because that organization pays for extended support.
The only thing that's surprising in these stories is that MSFT is actually capable of providing support for products for such a long time; their ability to maintain information and transfer it to new employees must be unparalleled.
The amount of documentation alone is probably enormous: 12 years of tens of thousands of bugs for each specific version of each binary. That's insane, especially considering that most companies out there have trouble supporting binaries that are 2 years old, since they have no clue what exactly was going on with them back then.
Spare a thought for the poor bastards Microsoft assigns this to - there can't be a whole lot of job satisfaction in porting bugfixes to a fifteen-year-old EOL'd OS for one customer.
You're wrong: old systems have such limitations that solving the problem demands cleverness and makes you feel good. Ask the guys from the demo scene making new developments (!) for the PC XT and CGA, posted here recently.
IIRC, the reason given was security reasons. It's black box security coupled with y'know, not being able to run anything modern, like a virus. And besides, does it really need anything else?
I'd guess/hope that this is mainly because of a bunch of embedded custom hardware (like PC-104 based systems) buried in different nooks and crannies of ships and weapons, as opposed to crufty VB and IE6 dependent office productivity stuff.