Depends how you look at it really.
Windows 1.0, released back in 1985, might look like a far cry from Windows 3.1 (which is really when Microsoft Windows actually took off in homes). The reality is that it was very, very similar, just running on more restrictive hardware and doing very similar tasks.
Windows 3.0 has elements of Apple's OS in it, like the [-] Close icons being on the left rather than the right. But the fact is this had been a feature since Windows 1.0; in fact I think most of the similarities were coincidence rather than anything deliberate.
Actually, if you look back, both Apple's and Microsoft's GUI operating systems have things that Amiga Workbench introduced:
Taskbar, Right-Click Menu, 'Amiga' Button Menu & Actions, Tabs, Scroll & Combo Boxes, etc.
Workbench 1.0 was one of the most advanced GUI operating systems out there, especially considering when it was released.
IBM-compatible machines (Microsoft) and Apple definitely lost the home market to Atari and Amiga until around 1994. The problem with Atari and Amiga, though, was that they designed their computers like consoles, expecting people to simply buy an entirely new unit each time a new model was released. This was their downfall, because you couldn't really customise them (until it was too late, that is).
If you ever take the time to look over Windows 95, you'll see it: I doubt it would've been even half as popular if it hadn't been made as simple as Amiga Workbench.
Windows 95 was really Microsoft's "crucial" time. They were pushing themselves forward, not just into true 32-bit computing, but because for the first time they actually controlled the market.
With Windows 3.1x firmly based as a GUI front for DOS rather than a true operating system, everyone could use their own variation of DOS: DR DOS, NDOS, SHell, MS-DOS, etc. Most of the time you were still actually working in DOS anyway, because some programs required the crucial 640K of conventional memory that Windows liked to take up.
32-bit software was supported via Win32s, but it was buggy and not really native; more like DOS/4GW, a 32-bit extender for DOS.
You also have to keep in mind that Linux is young.
While the OS it was modelled on, and technically its big brother (Unix), has been around since the early '70s, Linux itself only started to appear in 1991. Well, more like 1992 really, given the online communities at that point; it had to filter through quite a few people before it became widely enough available.
Downloading the 538K source back then was a challenge, given you were talking about a good 3-4 hours (no, SERIOUSLY: the best modems around were 14,400 baud, and not many people had them either; you were more likely looking at one of the 9600 baud models, which is what I had at least). And you could hardly call getting it free either: sure, the source was free and open, but 3-4 hours online could cost almost £15-20. A helluva lot just for some OS you had to build yourself before you could even use it.
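If you want the rough maths on the raw transfer, here's a back-of-envelope sketch only, assuming 8N1 serial framing (about 10 bits on the wire per byte) and a perfect noise-free line, so no retries, redials or protocol overhead:

    /* Rough back-of-envelope for dial-up transfer times: a sketch only,
     * assuming 8N1 serial framing (~10 bits on the wire per byte)
     * and a perfect line, i.e. no retries, redials or protocol overhead. */
    #include <stdio.h>

    int main(void) {
        const double size_bytes = 538.0 * 1024.0;   /* the ~538K source   */
        const double rates[] = { 9600.0, 14400.0 }; /* line rate, bits/s  */

        for (int i = 0; i < 2; i++) {
            double seconds = size_bytes * 10.0 / rates[i];
            printf("%5.0f baud: ~%.1f minutes raw transfer\n",
                   rates[i], seconds / 60.0);
        }
        return 0;
    }

The raw transfer is only part of it, though: busy lines, dropped connections and restarted transfers are what turned minutes into hours, and into that £15-20 phone bill.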
It didn't exactly come with an instruction manual either, so relying on other people's distros, like RM, Novell, and Red Hat, was the key to experiencing it.
What's interesting is that it really wasn't until the last 6-7 years that Linux has even had a GUI system, XFree86. So it was hardly starting the race on an equal footing.
The main problem you'll see with it, especially against Unix and Windows, is sheer 'compatibility' and 'standards'. I mean, it was created in the first place to be a compact and more standardised variation of Unix, something that could actually be a REAL alternative to DOS; so it's quite ironic that nowadays it's Unix that is considerably more standardised.
Linux has degenerated into effectively three key user groups:
- Cheap Server Solutions
- University Students
- Script Writers (generally nerds who think that using it makes them 'cool' and 'anti-micro$oft'.. yeah, I'm sure changing your OS to Linux is gonna get all the girls now)
It's not impossible for someone to create an OS to compete against Microsoft; you just have to play their game. At the same time, though, if you think it's simple to get even close to the same stability they have, then be my guest.
You see it as unstable, but their OS will literally run out-of-the-box on basically ANY x86 setup.
Linux & Unix don't have the same ability. While they're compatible with a large range of 'standard' hardware, if you're using something they don't recognise then it'll die.
For example, AMD and Intel processors work a charm, but ever tried using a Cyrix? lol, you're lucky to boot it up without it freezing.
There aren't the same safety checks to make sure the hardware actually works before accessing it, and accessing it incorrectly can potentially damage it.
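Just to illustrate what I mean by a safety check, here's a tiny hypothetical sketch in C; DEV_ID_REG, EXPECTED_ID and mmio_read32 are invented names for the example, not from any real driver. The idea is simply to check what the chip says it is before writing anything to it:

    /* Hypothetical "probe before you poke" sketch: read an ID register
     * and refuse to touch hardware the driver doesn't recognise, rather
     * than blindly programming registers on an unknown chip.
     * DEV_ID_REG, EXPECTED_ID and mmio_read32 are illustrative only. */
    #include <stdint.h>
    #include <stdio.h>

    #define DEV_ID_REG  0x00u      /* offset of the chip's ID register */
    #define EXPECTED_ID 0x1234u    /* the one chip this driver knows   */

    /* Stand-in for a real memory-mapped register read. */
    static uint32_t mmio_read32(const volatile uint32_t *base, uint32_t off) {
        return base[off / 4u];
    }

    static int probe(const volatile uint32_t *base) {
        uint32_t id = mmio_read32(base, DEV_ID_REG);
        if (id != EXPECTED_ID) {
            fprintf(stderr, "unknown device id 0x%04x, leaving it alone\n",
                    (unsigned)id);
            return -1;   /* never write registers we don't understand */
        }
        /* ...only now would it be safe to start initialising it... */
        return 0;
    }

    int main(void) {
        uint32_t fake_regs[4] = { EXPECTED_ID };  /* simulated register bank */
        return probe(fake_regs) == 0 ? 0 : 1;
    }

A driver written like that degrades gracefully on a chip it doesn't know; one that skips the check is the sort that locks up your Cyrix box at boot.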
The only reason there's more stability is that the supported hardware is more limited, but also, generally speaking, Linux builds its kernel when you install it (or update drivers).
So effectively it's like having a personalised OS, but don't expect to install it within seconds.. and don't expect it to run on anything aside from literally default hardware without recoding some parts yourself.
The market is too small, and the driver interface is the biggest pain in the arse, for developers to bother with it. Apple were sensible and changed that part of Unix... NVIDIA don't mind updating drivers for them across a relatively wide range of cards.
Linux is lucky to get a new build of its drivers every 6 months, and NVIDIA is one of the BETTER Linux-supporting companies.