I'm typing through a dual PS/2 mouse-and-keyboard-to-USB adaptor. The mouse side is empty and the keyboard side has a gc56mf-to-PS/2 adaptor on it, because my keyboard is from the 80s.
Depends which keyboard I use, but sometimes, same here. My main work PC has a 1991 Model M on it. Best PC keyboard ever made, and I have about 5 of them, carefully kept since people were throwing them away in the 1990s.
I remap Caps Lock to Super, and I'm good to go. I really don't need media-control keys or anything, or a right Super key.
I am trialling an early model of Das Keyboard on my Mac. The volume-control dial is quite nice, because I listen to music and watch films on that. I never use the media keys, and happily, it's old enough that it doesn't have nonsense like light-up keys, multicoloured lighting or any of that ugly frippery.
Most computer lighting and decoration is visibly designed to appeal to teenaged boys, with all the taste and discrimination that implies.
I mean, yes, Ctrl+Esc opens the Start menu, which the Windows key also opens.
But the Windows key -- before MS renamed it, the Super key -- is a single modifier key in its own right.
Win+R = Run
Win+D = hide all windows and show Desktop
Win+E = open Explorer
Win+Pause = open System Properties.
There are dozens of them; these are just a few of the ones I personally use.
Other OSes and desktops use it for other things. In GNOME it opens the Overview. In Unity it opens the search function; Super+A opens the app search, Super+S opens the virtual-desktop switcher, and so on.
It has lots and lots of uses, and opening the Start menu is just one. Ctrl+Esc doesn't do any of these other functions, and it cannot do them because it's not a modifier key so you can't use it plus another key at the same time.
People often say that they can't use an old, pre-Win95 keyboard because they need the Windows key. What I am pointing out is that it's easy to work around that, by sacrificing another key that's not very useful: Caps Lock.
This also sits in the same location as Ctrl on the old Sun keyboards, which is a convenient position for a modifier key -- it's easy to press CapsLock with your little finger while using the rest of the hand to press the other key of the combination.
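For anyone wanting to do the Caps-Lock-to-Super remap at the OS level on Windows, there's the registry's `Scancode Map` value. As an illustration only, here's a Python sketch that builds the binary blob, assuming the documented layout (version and flags DWORDs, entry count including the null terminator, then one DWORD per mapping); it needs a reboot to take effect, and the registry path and scancodes below are the standard documented ones, not anything from this thread:

```python
import struct

def scancode_map(mappings):
    """Build the binary value for HKLM\\SYSTEM\\CurrentControlSet\\
    Control\\Keyboard Layout\\Scancode Map.

    Layout (all little-endian): version DWORD (0), flags DWORD (0),
    entry count (mappings plus the null terminator), then one DWORD
    per mapping -- low word = scancode to emit, high word = physical
    key's scancode -- followed by a null terminator DWORD.
    """
    data = struct.pack("<III", 0, 0, len(mappings) + 1)
    for new, old in mappings:
        data += struct.pack("<HH", new, old)
    data += struct.pack("<I", 0)
    return data

# Make Caps Lock (scancode 0x3A) send Left Windows (extended 0xE05B):
blob = scancode_map([(0xE05B, 0x003A)])
```

On Linux the same remap is a one-liner with `setxkbmap -option caps:super` (or the equivalent in your desktop's keyboard settings), no registry surgery needed.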
Well, then I might as well point out that IBM did make an 8-bit adapter for VGA, so I could use my Sony Trinitron with my 10 MHz PC XT clone. It beat the daylights out of my roommate's 6 MHz AT, and ran Xenix, which most unfortunately turned me into an expert in Xenix crapware. After a bad compile destroyed the file systems on both drives, I went back to just Windows 1.0.6 and DOS.
The IBM PS/2 Display Adapter was a confusingly named 8-bit ISA VGA adapter. The PS/2 in the name referred to the fact that it allowed your prior generation machines to run the same graphics system as the PS/2 line.
When my IBM PC 5150's CGA monitor lost its vertical sync (and I was on a BBS and vibrated my lips so I could read the text to logout properly ... really!) I got a Cardinal VGA card for the 8-bit bus.
So yeah, VGA was possible all the way back to the original.
Not to understate the importance of the PS/2, but...
To say OS/2 failed because it didn't run more than one DOS app at a time is not really accurate. Well before Windows 3 (or NT) was established, OS/2 had its own app ecosystem and it was very pleasing to use, if you had a fast computer. Early OS/2 was meh, but from Warp and on, it was a very nice desktop OS.
OS/2 died because Microsoft cut deals with all big OEMs where they got Windows almost for free, as long as they didn't offer OS/2. It didn't help that, in order to run Windows software, OS/2 needed an expensive Windows license (and an almost full Windows).
There were other personal computers with GUIs back then as well - the Mac is still with us (well, not really - all current Macs are more NeXT than Mac), but there were Ataris and Amigas (and some Ataris and Amigas could run Unix). By then, any 386 with sufficient resources could run X (Sun had the 386i, running Unix and DOS apps side by side).
In the end, OS/2 failed because Windows had to succeed, and Microsoft made sure it did.
Are you talking about Microsoft/IBM OS/2, i.e. 1.x, or IBM-only OS/2, i.e. >=2.x here?
For 2 and above, the IBM-only versions, well, yes, kinda sorta. But that was too late. The game was already over.
OS/2 2.0 came out in April 1992.
Windows 3.0 came out in May 1990, 2 whole years earlier. It already had established an ecosystem before 32-bit OS/2 appeared.
Secondly, OS/2 2 really wanted a 386DX and 4MB of RAM, and a quality PC with quality name-brand parts. I owned it. I ran it on clones. I had to buy a driver for my mouse. From another CONTINENT.
Windows 3.0 ran on any old random junk PC, even on a PC XT class box with EGA. At first only high-end users of high-end executive-class workstations got the fun of 386 Enhanced Mode, but that was all OS/2 2.0 could run on at all.
OS/2 died when OS/2 1.x was a high-end OS with low-end features, and a cheapo low-end 386SX PC with 1 or 2MB of RAM, with MS-DOS and DESQview (not DESQview/X, just plain old text-mode DESQview) could outperform it.
(Remember the 386SX came out in 1988 and was common by the time Windows 3.0 shipped.)
> Windows 3.0 came out in May 1990, 2 whole years earlier
Oh yes. IBM, it seems, was late in getting the news that Microsoft wasn't as committed to OS/2 as IBM was. I can only imagine what was said in IBM's top management back then.
What Microsoft saw that IBM didn't was that having a continuous path without breaks was more valuable than having an objectively better product. That's also why we are using PCs today and not Amigas, Ataris, or Suns.
I wouldn't be so sure about that. It was widely discussed in the media at the time.
In my then-job, my boss sent me on a training course for 3Com's new NOS, 3+Open, which was based on OS/2 1.0.
I did not realise it was a clever evaluation strategy. He knew I might have enthused about it if given a copy to play with. Instead, being trained on it, I was told some of the holes and weaknesses.
I came back, told them it was good but only delivered with OS/2 clients and had no compelling features for DOS clients, and they were very pleased -- and the company went on to start selling Novell Netware instead.
Looking back that was a good choice. In the late 1980s Netware 2 and 3 were superb server OSes for DOS, and OS/2 with LAN Manager wasn't.
But yes, I think from soon after OS/2 launched, it was apparent and widely reported to IBM that MS was not happy, and as soon as MS started talking about launching a new updated version of Windows -- circa 1989 -- it was very clear to IBM that MS was preparing a lifeboat and would soon abandon ship. By the time Windows 3.0 came out MS had left the project. Its consolation prize was to keep Portable OS/2, also called OS/2 3.0, later OS/2 NT.
This wasn't secret, and everyone in the industry knew about it. The embarrassment to IBM was considerable and I think that's why IBM threw so many people and so much money at OS/2 2.x. It was clear early on that although it was a good product, it wasn't good _enough_.
NT 3.1 launched in July 1993, just a couple of months after OS/2 2.1, and NT made it pretty clear that OS/2 2.x was toast.
I deployed NT 3.1 in production and supported it. Yes, it was big and it needed a serious PC. OS/2 2.0 was pretty happy on a 486 in 4MB of RAM and ran well in 8MB. NT 3.1 needed 16MB to be useful and really wanted a Pentium.
But NT was easier to install. For instance, you could copy the files from CD to hard disk and then run the installer from MS-DOS. OS/2 had to boot to install, and it had to boot with CD drivers to install from CD. Not trivial to achieve: ATAPI CD-ROM drives hadn't been invented yet. It was expensive SCSI drives and a driver for your SCSI card, or proprietary interfaces and proprietary drivers, and many of those were DOS-only.
NT didn't have OS/2's vast complicated CONFIG.SYS file. It had networking and sound and so on integrated as standard, while they were paid optional extras on OS/2.
And NT ran Windows 3 apps better than OS/2, because each Windows 3 app had its own resource heaps under NT. Since the 64kB heap was the critical limitation on Win3 apps, NT ran them better than actual Windows 3.
If you could afford the £5000 PC to run NT, it was a better OS. OK, its UI was the clunky (but fast) Windows 3 Program Manager, but it worked. OS/2's fancy Workplace Shell was more powerful but harder to use. E.g. why on earth did some IBMer think needing to use the right mouse button to drag an icon was a good idea?
I owned OS/2, I used it, and I liked it. I am still faintly nostalgic for it.
But aside from the fancy front-end, NT was better.
NT 3.5 was smaller, faster and better still. NT 3.51 was smaller, faster and more stable again, and was in some ways the high point of NT performance. It ran well in 8MB of RAM and very well in 16MB. On 32MB of RAM, if you were that rich, we NT users could poke fun at people with £20K-£30K UNIX workstations, because a high-end PC was as fast, as stable, and had a lot more apps and a much easier UI.
Sad to say, but the writing was on the wall for OS/2 by 1989 or so, before 2.0 even launched. By 1990 Windows 3.0 was a hit. By 1992 Windows 3.1 was a bigger hit and by 1993 it was pretty clear that it was all over bar the shouting.
There was a killer combination that had a chance, but not a good one: Novell Netware for OS/2. Netware ran on OS/2 and that made it a useful non-dedicated server. IBM could have bought Novell, combined the products, and had a strong offering. Novell management back then were slavering to outcompete Microsoft; that's why Caldera happened, and why Novell ended up buying SUSE.
(For whom I worked until last year.)
OS/2 plus Netware as a server platform had real potential, and IBM could have focussed on server apps. IBM had cc:Mail and Lotus Notes email, it had the DB2 database, and soon it would have WebSphere. It had the products to bundle to make OS/2 a good deal as a server, but it wanted to push the client.
Also the PS/2 keyboard and mouse connectors. Before that, mice spoke to serial ports or required an add-in card with a nonstandard port. The people buying PS/2s were often very concerned with mice; they'd seen them on TV and they wanted one.
Same physical port. Perfect for plugging into each other and having neither one work. Later on they color-coded them (purple and green; some companies used orange?)
I was always jealous of the Mac's ADB. You could daisy chain it, plug any device into any port, it was nice.
I think there was nothing more standard than RS232. You needed to add a "multi" board (joystick, parallel, rs232), yes, but that was absolutely standard.
"Bus" mice were popular too. [0] Cheaper to produce than RS2323 mice I think, but required the special interface card or motherboard support.
The bus mice were semi-popular with Amiga owners because they were easily converted to Amigas. RS232 mice could be used with Amigas too, but only worked in Workbench programs which used the operating system drivers. Most Amiga games etc used the mouse hardware registers and the RS232 mouse would be ignored then.
I ran OS/2 1.x (in particular 1.3 was nice) on a 386 clone PC.
Later I ran OS/2 2.x on a PS/2 Model 77, and later still Windows NT 3.51.
> That is why MS had the money to headhunt the MICA team from DEC, headed by Dave Cutler, and give them Portable OS/2 to finish. That became OS/2 NT (because it was developed on Intel's i860 RISC chip, codenamed N-Ten.) That became Windows NT.
Good bit of trivia. I remember it starting out as OS/2 v3 and being marketed as Windows New Technology.
What I remember most about the PS/2 is the crazy power switch that internally connected to a metal rod that connected to another switch on the PSU itself.
"According to Bob Dubke, the second engineer on IBM's 5100 team in
Rochester (who now co-owns a locally-based company called eXport
Ventures Corp. and also works for Edina Realty), that secret function
was his contribution to the design of the computer. The function, which
IBM suppressed because of worries about how their competition might use
it, was an interface between the assembly code surrounding the
computer's ROM exterior, and the 360 emulator hidden beneath it. (IBM
declined to comment for this story.) The 5100's emulator gave
programmers access to the functions of the monstrous, and much less
portable machines, that IBM had produced during the 1960s. An imprint
of a hook on the outside of the 5100 symbolized the ability of Dubke's
interface to drop into what Titor called "legacy code," and scoop out
any necessary operating instructions."
There's no doubt about the pivotal importance of IBM's PS/2 range, and I was instrumental in my organization acquiring many. But that wasn't without problems: we already had many ISA machines, so PS/2s weren't compatible at a board level. That increased the cost of maintenance, amongst other things.
However, when I think back, the two most disruptive problems for us were the development of the 80286 and the failure of OS/2. The 80286 was a dog of a design, not from its silicon per se but from its design concept: the mode-switching problem, triple-faulting the processor to get back to real mode, and so on.
I understand the reasoning behind why Intel designed the processor that way as an explanation of why it did so took up most of the time of a long Intel seminar/presentation that I attended just when the chip became available.
It's easy to be wise in hindsight but it would have been better for everyone—except of course Microsoft—if the 80286 had never existed and that the 8086 platform had migrated directly to the 80386.
This is a bit of history, I had to search a while for:
"The designers of the 80286 based it on the Multics segmentation model, but in reality segmentation makes it really hard to implement, because of internal memory fragmentation. A while back I wrote blog post about how an alternative 80286 with segment registers pointing to 256 byte paragraphs giving a true 24-bit virtual address which was then paged into a 24-bit physical address space. 8068 compatibility could then be managed by having the segment registers map to 16 byte paragraphs as before."
But: they screwed up on the switch back to real mode, and on the reservation of the 4th byte of the descriptor tables, which made it so that Xenix-286 cannot run on a 386. (Long discussion... I think that MS purposefully made it incompatible.)
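The arithmetic behind that quoted alternative is easy to show. A sketch in Python, for illustration: real-mode 8086 segments select 16-byte paragraphs (20-bit addresses, 1 MB), while the hypothetical 286 in the quote would use 256-byte paragraphs (24-bit addresses, 16 MB):

```python
def linear_8086(segment, offset):
    # Real-mode 8086: the segment register selects a 16-byte
    # paragraph, giving a 20-bit (1 MB) address space.
    return ((segment << 4) + offset) & 0xFFFFF

def linear_alt286(segment, offset):
    # The hypothetical 286 from the quote: 256-byte paragraphs,
    # giving a 24-bit (16 MB) virtual address space.
    return ((segment << 8) + offset) & 0xFFFFFF

# 0xFFFF:0xFFFF on a real 8086 wraps around the 1 MB boundary --
# the wrap that the A20 gate and the HMA hack later exploited.
top_8086 = linear_8086(0xFFFF, 0xFFFF)   # 0x0FFEF after the wrap
top_alt  = linear_alt286(0xFFFF, 0xFF)   # 0xFFFFFF, top of 16 MB
```

The nice property, as the quote says, is that the same segment registers could fall back to 16-byte paragraphs for 8086 compatibility, with paging underneath handling the physical mapping.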
Segmentation is harder to work with, but it has benefits too. At the time, when it was all new, everyone just complained about it: they wanted the ease of use of flat, unsegmented memory, with just one memory address.
More recently, the NX bit has become desirable and promoted. The CPU can flag areas of RAM as data, and mark them as Non eXecutable. That is all: one bit, so either it is code and you can run it, or it's data and you can't.
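To make the "one bit" point concrete: on x86-64 the NX (Intel calls it XD) flag is literally bit 63 of a page-table entry. A toy sketch in Python, purely illustrative (real page tables live in the kernel, and NX must also be enabled via the EFER MSR):

```python
NX_BIT = 1 << 63  # x86-64 page-table entry: bit 63 = eXecute Disable

def mark_no_execute(pte):
    """Set the NX bit on a page-table entry value."""
    return pte | NX_BIT

def is_executable(pte):
    # A page is executable only while the NX bit is clear.
    return not (pte & NX_BIT)

pte = 0x1000 | 0x3            # frame at 0x1000, present + writable
data_pte = mark_no_execute(pte)  # same frame, now data-only
```

One bit per page, nothing more: it's code or it's data.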
But the original Intel 386 design had 4 protection rings: 0, 1, 2 and 3.
0: kernel mode, can do anything.
1: inner ring, privileged code, but with restrictions.
2: outer ring, some hardware privileges: can do device I/O.
3: user mode code, no hardware access privileges, has to request everything through the kernel.
OS/2 2.x used the middle rings, and so, optionally, did Novell Netware 4.x for NLMs (apps that ran on the server, which was rare for Netware). Not much else did.
Microsoft only ever used 0 and 3.
But with memory segments, you could, in theory, flag each segment by ring. This is much more flexible and controllable than just one NX bit.
Segmentation had benefits, but they took more work, more effort and thought and planning. You needed to design the OS for it, and the drivers, and the apps.
But remember how DOS evolved: it's a clone of CP/M. No multitasking, no permissions, only 1 app in memory at a time.
Back in the 1980s, the industry was growing by adding stuff onto a brain-dead, super-simple design from the 1970s.
They saw segments as a hindrance, an unnecessary complexity. They wanted them to go away.
In the 1990s, we got 32-bit desktop personal computers and 32-bit end-user desktop OSes. Apple's design was a single-tasking OS, and it had to buy a Unix vendor to make a grown-up OS.
Microsoft's engineers achieved miracles bolted on to DOS, but in the end, it had to ditch its entire OS too, and build a whole new one: Windows NT. It took headhunted talent to make it work.
IBM dropped the ball, which is the whole point of my blog post that is the root of this discussion.
Now, we have CHERI, building protection segments back in again. I've written about that too:
It's a huge waste of human effort: we had them in the 1980s, but they were too hard and we threw them away.
UNIX did too. Multics used them, but Multics was too hard. UNIX was Multics with the hard bits left out. And now, 50 years later, we're discovering that we needed them all along and we're painfully and with great difficulty putting them back in again.
Excellent concise summary, glad you and ForOldHack beat me to it. I would have confined my reply to the ring issue having forgotten much of the controversy with other OSes—Netware for instance (even though I'd used it many times).
You're right about segmentation being hard work. I can recall cursing Intel at the time and hoping it would bring out a more elegant processor that was easier to use, such as Moto's 68000. However, there wasn't any thought of moving over to Moto, because of our existing investment in Intel's development systems (Multibus) and in the 8085 and 8051; they were still current and we were still using them.
"It's a huge waste of human effort:"
Absolutely, if I could have that time over I reckon I'd opt for a different profession. Everyone from manufacturers to end users wasted a huge amount of time and energy that resulted from stupid mistakes to outright shenanigans. We saw it all happen from a developer/end user's perspective and we weren't immune either (also it ended up costing everyone unnecessary money).
For me, the most bitter arguments were over the change from IBM to Microsoft which also included the shift from OS/2 to Windows. Technical arguments became irrelevant after senior management read press reports that Microsoft was the new 'in fashion' company when it came to computing. I recall saying to management "why waste all the expense on an engineering outfit when you can simply find solutions in the finance section of the newspaper". That didn't win me any Brownie points.
Why so many in computing and tech became so besotted with Gates and Microsoft, and why more level heads never prevailed at the time, has always troubled me. For instance, when Gates was quoted as saying "640K ought to be enough for anyone", I recall saying to colleagues who agreed with me, "either Gates is daft or he's been forced to read from a PR release". Whilst there's now doubt that he ever said those words, there was never any outright denial at the time. Similarly, the technical press never went out of its way to say how stupid the notion was. It was obviously silly, or at best ill-considered, so why didn't the press say so? There's shades of 'the emperor's new clothes' here, methinks—and that's troubling.
I'm of the opinion that, long after all protagonists of that era are dead and buried, researchers will tally up the actual cost of that wasted human effort, and we'll find that whilst Microsoft wasn't completely responsible, it nevertheless played an overly dominant role. The same applies to Windows and MS's deviations from the original CUA (which are still ongoing). Just think of the millions of hours lost worldwide by users having to relearn and futz with the UI whenever MS changes it. If dollar figures were put on this wasted human effort, they'd have to be staggering.
I find it difficult to think of a context in which Gates would say that 640kB was enough, given that the limitation came from the IBM PC. MS-DOS could quite happily address 768kB on an RM Nimbus, for instance. The more fundamental 1MB limit came from Intel. Gates didn't have to justify either.
There's no doubt that that quote was the talk of the IT world back then. It did the rounds in magazines and news outlets for ages and I cannot recall any of those articles doubting that Gates actually said it.
It was also wheeled out later in computer mags often as mocking comment when the '386 came out as by then PCs had 4MB RAM.
Doubt about the quote's origins only arose about a decade later after Gates denied saying it. I'm now racking my brain trying to remember which magazine cemented the idea that Gates was responsible. I already knew about the quote but this was a big splash and it had a photo of Gates and cohorts on a podium (product launch) at or near the top of the article. Could have been PC Magazine or possibly InfoWorld but I really can't remember.
I'd not really given the matter much thought until now but it does seem strange that doubt still persists (or that it does at all) given the widespread attention news reports paid to it at the time.
I wrote the comment and the blog post. What wording are you referring to?
Yes, Windows supported the 286 too. There was a special Windows 2 version for 286s and a different edition for 386s; what was novel in Windows 3.0 was that one edition and one install on disk supported all three: 8086, 80286, and 80386, all in one. Windows 3.0 had 3 modes:
- if it had <2MB of RAM or just an 8088/8086/NEC V20 etc. chip, it started in Real Mode.
- if it had 2MB of RAM and a 286, it started in Standard Mode. No 386 features enabled.
- if it had >=2MB of RAM and a 386, it started in 386 Enhanced Mode.
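The startup decision above can be sketched as a small function. This is a simplification to illustrate the three modes (the real WIN.COM checks, free-extended-memory tests, and override switches like /r, /s and /3 were more involved):

```python
def win30_mode(cpu, ram_kb):
    """Pick a Windows 3.0 startup mode, per the rules above.

    cpu: 86 for 8088/8086/V20-class chips, 286, or 386 (and up).
    ram_kb: installed RAM in kilobytes.
    """
    if cpu >= 386 and ram_kb >= 2048:
        return "386 Enhanced Mode"
    if cpu >= 286 and ram_kb >= 2048:
        return "Standard Mode"
    return "Real Mode"
```

So the same install booted into whatever the hardware could support: a 386 with only 1MB still fell back rather than getting Enhanced Mode.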
It wasn't well-known even then, but Windows 2 and 3 could multitask DOS apps without a 386. V86 mode wasn't needed for the multitasking, but for the memory management. The multitasking worked, the problem was trying to fit several DOS apps into the base 1MB of RAM for them to multitask.
What the 386 delivered was separate 640kB slots for each DOS VM.
I read about the multitasking of DOS programs in some round-up of DOS multitasking systems (DeskView et al.), and convinced my boss to fund experimentally adding 4 MiB of RAM and a Windows 3.0 license to my 386-based clone. I got a build (cross-assemble) of my code going, and then entered the Brief editor, and pondered what exactly I had enabled. Then an in-house utility was invoked from the makefile to download my code into the target, and my whole system froze. Thus endeth the experiment.
Anything that hit the metal directly could cause problems, yes. You had to be pretty careful and test stuff first.
TBH I didn't use these multitasking tools much. I was a support guy then; I worked on other people's PCs so I needed to know how to use the built-in tools to best advantage. I could not rely on additional apps, utilities, etc. because the PC I was fixing almost certainly wouldn't have them.
Late in that era I carried around a wallet with about 20 floppies full of freeware tools and utilities for this reason.
But for me it was occasionally very handy to be able to go and make a directory, or format a diskette or something, while in the middle of something in an app.
But I wouldn't try something that wrote to the hardware, do file transfers or something. Serial ports if you were very careful to set them up so they didn't share IRQs, which of course on the PC was the normal standard config -- COM1: shared IRQ 4 with COM3:, and COM2: shared IRQ 3 with COM4:
While LPT1: was on IRQ 7 and why on Earth can I still remember this 35 years later? >_<
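Since it's burned into memory anyway, those classic PC/AT defaults make a nice little table, and the sharing conflict falls right out of it. A sketch (the I/O base addresses and IRQs are the conventional defaults; clone BIOSes occasionally varied):

```python
# Classic PC/AT defaults: port -> (I/O base address, IRQ).
STD_PORTS = {
    "COM1": (0x3F8, 4),
    "COM2": (0x2F8, 3),
    "COM3": (0x3E8, 4),  # shares IRQ 4 with COM1
    "COM4": (0x2E8, 3),  # shares IRQ 3 with COM2
    "LPT1": (0x378, 7),
}

def irq_conflicts(ports):
    """Return groups of the given ports that share an interrupt line."""
    by_irq = {}
    for name in ports:
        by_irq.setdefault(STD_PORTS[name][1], []).append(name)
    return [tuple(group) for group in by_irq.values() if len(group) > 1]
```

Which is exactly why a serial mouse on COM1 and a modem on COM3 was a recipe for lockups: same IRQ, and the ISA bus couldn't share interrupt lines safely.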
Even without the reference, you were there, and I was there too. I had to support a client who needed Win/386 2.0 protected mode for a DOS app, as well as to run PageMaker.
AFAIK other companies picked up 32-bit SIMMs very slowly. Sun in 1991, in the SPARCstation IPX. The first x86 support showed up in 1992 on 486 boards, just before the introduction of VLB slots. Apple was a year later, with the LC III/520 and Centris in 1993.
Sure, yes, because the purpose of 72-pin SIMMs was for 32-bit wide memory access. Before 72-pin SIMMs were 30-pin SIMMs.
Simple arithmetic shows you that you can't route a 32-bit memory bus over 30 pins. You need banks of SIMMs and that means they need to be matched.
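Spelling that simple arithmetic out, as an illustration: a 30-pin SIMM is 8 bits wide (9 with parity), a 72-pin SIMM is 32 bits wide (36 with parity), and a bank must span the whole data bus with matched modules:

```python
def simms_per_bank(bus_bits, simm_bits):
    """How many matched SIMMs does one memory bank need?

    A bank has to span the full data bus, so it takes
    bus width / SIMM width identical modules.
    """
    if bus_bits % simm_bits:
        raise ValueError("SIMM width must divide the bus width")
    return bus_bits // simm_bits

# 386DX/486 (32-bit bus): four matched 30-pin SIMMs per bank.
# 386SX (16-bit bus): pairs of 30-pin SIMMs.
# Any 32-bit bus: a single 72-pin SIMM per bank.
```

Hence the familiar rule that 30-pin SIMMs had to be bought and installed in matched fours on a 386DX or 486, but in pairs on a 386SX.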
But the 386DX was too expensive, because it needed a 32-bit memory bus. So it didn't sell well and was mainly used in PCs used as servers. PC servers didn't really exist yet in 1985.
That's why the 386SX was invented, with a 16-bit memory bus. It meant the motherboard only needed a 16-bit memory bus, meaning it was much cheaper.
But that means it doesn't really need 72-pin RAM.
It wasn't until the 486 (and 68040) went mainstream that ordinary desktop computers needed 32-bit memory buses and therefore 72-pin RAM, but over time, economies of scale drove down the cost of 72-pin RAM so it became cheaper and gradually got adopted, and the 16-bit-bus 386SX (and derivative chips with the 486 instruction set, such as the 486SL and 486SLC) gradually went away.
The 68020 and 68030 were fully 32-bit, weren't they? They had the dynamic bus sizing thing so they could boot off one EPROM, but they had the full bus exposed, AFAIK.
I have a 386SX that has a single 72-pin SIMM socket. It seems to be a 1991 design, by the stickers on components.
Even more weirdly, the machine could be equipped with a 286 daughterboard instead of the 386SX. I wonder if it just discarded half of each memory access to fill the 16-bit bus.
It's strangely overbuilt -- very feature-rich BIOS, almost tool-less chassis, 72-pin memory socket, lots of surface-mount parts -- but paired with the slowest 386 on the market and a hard disc that was probably about 3 years out of date (a Seagate ST351A/X, capable of running in that weird 8-bit IDE mode). I have to figure it was Philips flexing their last few years of being a premium electronics manufacturer: they could make a nice PCB and chassis but still had to buy the computery stuff from elsewhere.
It didn't originate with the PS/2 line, but the Model M keyboard with Selectric roots was standard equipment, an absolute beast, and the best keyboard I've ever had.
You're not wrong, but the design of the Model M sought to emulate the feel of the iconic Selectric, as did the Model F with the solenoids under each key, both equally dissimilar to the Selectric's internals.
It's funny. The Model M was a way to make a cheaper keyboard than its predecessor, the Model F, which, in turn, was IBM's way to make a cheaper, lighter alternative to the beam-spring keyboards they were using in terminals and other "more serious" computers.
I regret so much having dumped that keyboard... best thing ever. Only problem: when typing at night (what I really liked) the whole block was awake! :D
Speaking of pivotal: both IBM and Microsoft poured an ungodly amount of money into translating Windows 95 and OS/2 Warp into a great number of languages; as far as I am aware, this hadn't happened before. I was a journalist at a computer monthly in Hungary at the time. Now, the MS and IBM teams were not talking to each other; in fact Microsoft was working in "ivory tower" mode, the translation happening in house, almost in secrecy. IBM was at least partially outsourcing to local firms, who were coordinating with us. An awful lot of the very language pertaining to computers was invented or re-invented due to this. Yes, during the 80s there were of course already computer magazines, and even the TV was talking about it, but the terminology was absolutely awful and didn't fit the language well. The effort IBM put in was, well, pivotal. We didn't get paid -- but we had the major advantage of making the magazine readable, and very soon we had eclipsed everyone else on the market. Of course this had other reasons too, but the effect of this cannot be dismissed.
OS/2 Warp was out as I purchased my 386, and I picked up a copy, and I really wanted that to work, but it was already clear that IBM was playing catch-up with Windows because Warp was marketed based on how well it could run Windows stuff.
I thought OS/2 Warp was really neat. I worked IT at Bloomingdale's and at the time they were a strictly IBM house. They used IBM computers running OS/2 Warp in their Bridal Registry kiosks. Since this was a kiosk they had touchscreens and they worked very well and had a "sparkle" in the screen, as the early touchscreen CRT's did. If you lost power, or shut them down incorrectly you would get a bunch of DLL errors and have to re-image the PC.
Yes. Especially if you were putting an image back to a harddrive of the exact same size and specification, it was very doable. You could keep an untouched hard drive with a known install and copy it block for block.
The term means "can run IBM PC software". The PS/2 could. It was compatible. Same OSes, same apps.
Remember this was the 1980s. There were almost no protect-mode 32-bit OSes around, and the few that there were then had PS/2 versions. DOS and DOS apps ran fine on PS/2s, unmodified.
> It was thought that by creating a new standard, IBM would regain control of standards via the required licensing. As patents can take three years or more to be granted, however, only those relating to ISA could be licensed when Micro Channel was announced. Patents on important Micro Channel features, such as Plug and Play automatic configuration, were not granted to IBM until after PCI had replaced Micro Channel in the marketplace. The overall reception was tepid and the impact of Micro Channel in the worldwide PC market was minor.
> The PS/2 line was created by IBM partly in an attempt to recapture control of the PC market by introducing the advanced yet proprietary Micro Channel architecture (MCA) on higher-end models. These models were in the strange position of being incompatible with the IBM-compatible hardware standards previously established by IBM and adopted in the PC industry. However, IBM's initial PS/2 computers were popular with target market corporate buyers, and by September 1988 IBM reported that it had sold 3 million PS/2 machines. This was only 18 months after the new range had been introduced.
I was there. I installed hundreds of the things. I still own 2 of them.
The term "PC compatible" is about software, not hardware. It's not about slots. For instance, laptops were and are still PC compatibles and 99% of laptops have no expansion slots.
The PS/2 range ran bog-standard unmodified PC DOS and MS-DOS, and unmodified DR Concurrent DOS. It ran plain unmodified standard Windows 2, 3 and 3.1.
I put it on them. I deployed them in production. I have personally done this; this is not research or hearsay. I did it myself. In my basement right now is a PS/2 Model 80 with 16MB of RAM that runs Windows NT Server 3.51.
PS/2s were PCs.
Furthermore MCA was not "out of reach to clone makers".
I personally installed at least one Apricot VXFT server with SCO UNIX, and MCA slots.
MCA lost out to VL-Bus, EISA, and ultimately to PCI, but it did OK for a while.
There wasn't really anything wrong with MCA. Its issues were twofold and relatively minor:
[1] EISA, VLB and PCI could co-exist with ISA. MCA couldn't.
[2] Manufacturers had to pay to licence it.
If IBM's OS/2 plan had worked, and OS/2 had been a 386 OS and thus been compelling to DOS users, then in 1987, the PC industry would have had a pre-emptive multitasking OS, and by 1988, it would have had a GUI too.
Windows 3 would never have happened. There's a chance that 386BSD might not have happened. Linux almost definitely wouldn't -- it was 3 years later.
Then IBM's package of a fancy new PCs with shiny new ports and graphics and memory, with a fancy multitasking OS, plus a fancy new 32-bit bus, would have been compelling.
But it dropped the ball. It insisted on the 286 being included.
On the subject: I am typing this response with a PS/2 keyboard from 2003. I use an old computer with old software, which is why Dreamwidth blocked me.