Absolute dominion

Microsoft has come a long way from the garage where it was born. In little more than a decade it took the world by storm, driven by a vision: a computer on everybody's desk, with accurate colour, moving images and sound. It was a daring vision, made possible by parallel developments in hardware.
Ruthless in its business practices, Microsoft took on every company that doubted or threatened its hegemony, not by making a better widget or better software but by suing it to death, until no way out existed other than to co-operate fully and submit to the biggest bully in the field. Microsoft also sought uniqueness by disregarding existing standards and taking refuge in proprietary software, the very mistake that killed the big computer companies before it.
"Neither the 1994 consent decree nor the 1998 antitrust conviction have caused any change in Microsoft's behavior. The big OEMs do what Microsoft tells them to, and will continue to do so as long as it holds a monopoly. This is another chicken-and-egg problem; only a vendor who can survive without selling any Microsoft products can afford to displease Microsoft."But Microsoft did not make the PC a proprietary product. Or did it? All Microsoft PCs are based on Intel chips and it has very tight business agreements with Intel. Still, the chip does not exclude other software to run on it, but will this remain guaranteed?
Microsoft has been accused of anti-competitive behaviour and has been sued in court, but to no effect. Today Microsoft is so large that nobody can stop it. Or will it die of its own hubris, like Rome did? Will there be an uprising, a revolt? Or could it die of its own mistakes?
"Microsoft doesn't get called the Evil Empire for nothing; they know every monopolistic trick for suppressing the competition there is, they are utterly ruthless about using them, and it has already been shown that they can successfully evade anti-trust consequences."
Three families

Microsoft grew large on the single-tasking DOS, which had as its 'operating system' the BIOS (Basic Input Output System) located in non-volatile memory. A DOS machine thus always starts up with some basic intelligence. Application software relied on BIOS calls and some ad-hoc drivers. DOS was never a true operating system like UNIX, which had existed since around 1970. But it made many applications possible, including games.
The big improvement came with the Pentium chip (i586), which outperformed its predecessors by such a large margin that it made Windows 95 possible, soon followed by Windows 98, 98SE and Windows ME (Millennium Edition). We were now experiencing hyperinflation in software code, requiring memories of over 256MB and disk space of over 2GB! Remember that DOS ran in a mere 1MB, with storage first on floppies only and later on hard (Winchester) disks of up to 10MB.
The Win9x family was notorious for its unreliability, so quite independently of it, the NT (New Technology) family was developed. This branch of Windows has a proper operating system kernel and a better file system as well (NTFS, NT File System), rather than the FAT (File Allocation Table) file system of DOS. But NTFS comes at a price.
Later an in-between was born: the Windows 2000 (Win2K) family. NT is now being phased out in favour of 2K.
Windows XP is a bit of an oddball: it runs on an NT kernel but carries an emulation layer for legacy DOS and Windows applications. As a result it can be intolerably slow, requiring the fastest of chips and large memory to hide its shortcomings.
Dangerous bedfellows

Yes, '1984' is here, a few years later than George Orwell predicted in 1949, but already a sinister reality for half a decade. Windows in all its forms is the most dangerous spyware, allowing very powerful organisations and individuals access to any computer that runs it. 'Orifices' or 'back-doors' offer access to those who paid for the privilege or who 'hacked' themselves in.
Can Linux become a door for spies? Not likely, say the Linux developers, because Linux is 'open' software and a spy-door would soon be discovered; besides, there are many distros and you would need to plant orifices in each. But what if you purchase your distro (Linux distribution) from an American company? Who can check that it is spy-proof? Obviously we will see a shift away from American providers. For the moment Linux is safe, because it still has so few users, is open software and has several distros.
Bloatware

Why does the world accept that a hardware upgrade is needed each time a new version of Windows sees the light of day? The reason is partly stupidity, partly design, partly the 'free' market, and partly that civilisation has become more complicated.
The microcomputer did not win because it was better but because it was cheaper, and this philosophy filtered down to every part of the operating system and all software, until it became a liability. Take printers alone: well over 10,000 different models may have existed. At first these talked over the parallel printer port according to a standardised protocol (Centronics) and character set, but soon variants crept in. When word processing demanded output of grey scales, variable-pitch characters and lined boxes, the protocol was no longer sufficient. Each printer now required its own factory-built driver. As printers became less and less intelligent, their processing was taken over by their drivers inside the computer, and these drivers became more and more specific. To make matters worse, the differences between the various editions of Windows were sufficient to require separate drivers for each Windows version and each type of printer. The legacy of the lack of printing standards became a huge burden, and the same applies to all other hardware.
There is something magical and unexpected about standardisation and limiting one's choices with care: the result is more choice rather than less. Think of Lego or Meccano. With care these toys limited the dimensions and pitch (modulus) of their sockets such that modularity made any combination possible. The problem is that standardisation has not kept up with the pace of technology, and Microsoft carries a lot of the blame.
The golden promise of computing was that if the basic modules had been chosen with care, a layer of other modules could be constructed on top of them, and above that the application software, and above that meta-software (like GUIs). The software heap would look a bit like a pyramid, becoming ever more capable and intelligent as it grew wider and taller. A small increase in the pyramid's size would equate to a leap in capability and intelligence. But this did not happen: in the DOS period, every software house made its own modules, with its own oddities and bugs.
Windows brought a fresh approach by providing many standard functions, accessible through documented API (Application Program Interface) calls. Tragically, this also locked software and hardware vendors into the MS hegemony. Even so, much duplication occurred, to the extent that it would be fair to say that about 98% of the code on your computer is duplicated somewhere. Linux systems fare considerably better in this respect.
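To make the idea concrete, here is a minimal illustrative sketch in C of what such a documented API call looks like: instead of implementing its own dialog box, the application simply asks Windows for one.

    #include <windows.h>

    /* Minimal sketch of a documented Win32 API call: the application
       delegates the whole dialog box to the operating system instead
       of drawing its own. Link against user32. */
    int main(void)
    {
        MessageBoxA(NULL,                       /* no owner window */
                    "Hello from the Win32 API", /* message text    */
                    "API example",              /* title bar text  */
                    MB_OK | MB_ICONINFORMATION);
        return 0;
    }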
We are using bigger media now. A photo from your camera is some 10-30MB but can be compacted to 3-7MB without much harm. A two-hour video, highly compressed, requires 4.5GB; uncompressed in the cutting room, a whopping 50-150GB! So you can see where the memory and disk space go. But why would an operating system like WinXP devour so much memory? My own story must be recited here.
About a year ago I was still using an IBM Aptiva computer (500MB, 600MHz) with my main workhorses, Corel Draw and PhotoPaint version 7. To sharpen my tools I bought Corel version 12, but nowhere did it say that this new version would not run on Win9x. I had to upgrade to WinXP. But this operating system is so bulky and slow that it would take 20 minutes to print a picture that would normally take one minute. I could not even get through a day's work. So I needed a faster computer with more RAM and a printer with USB capability. Once I loaded that machine up, it needed to re-register. But because the machine had changed substantially, and MS OOBE (Out-Of-Box Experience) detected this, I was accused of pirating WinXP and threatened with heavy-handed legal action. Imagine my anger. WinXP had already cost me a few thousand bucks in upgrades, and I never wanted this slow dog anyway!
Now I'm back to Corel version 11, which runs happily on all flavours of Windows. But that did not end the story. WinXP was still slow, but I managed to undo all the 'candy' with a most useful utility that is not shipped with WinXP: TweakUI.exe. Don't miss it on the Windows web site. I managed to make the situation bearable, and with other tools in place I think I am now slightly better off, but at considerable expense. Next to my desk, my partner runs a Toshiba Tecra 8100 laptop, a well-designed package with 256MB RAM, a 600MHz clock and Win98SE, and everything there works just as fast as on the much faster and bigger computer I now have, and it starts up much faster too. Figure it out!
Another important factor causing bloatware is that society has become more complex. It begins with your computer box. Whereas early computers had a couple of serial ports and a parallel printer port, your box is teeming with USB ports, a mouse port, special sound processors, a whopping display driver, and many more features you'll never know are there. Each needs a supporting driver, even though you may never use them all. Your box may in fact contain three or four processors.
Another complexity comes from our cultural differences. For instance, early computers were happy with 64 characters of text, but this soon expanded to 127 as the standard ASCII set (American Standard Code for Information Interchange). To accommodate all 'Latin' languages with strange doodads like éîûÑ, 255 characters were just sufficient. But now we need to accommodate Chinese characters and Japanese Kanji with over 20,000 tokens, and languages that are read back to front, up and down, and so on. It would be plain stupid to let these complexities trickle down to the deeper layers of the operating system, where speed is paramount. Just getting screens to display Kanji is a major juggling act.
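A small C sketch illustrates the cost at the byte level: the 'same' one character grows from one to three bytes as we move from ASCII through accented Latin to Kanji, assuming the now-common UTF-8 encoding.

    #include <stdio.h>
    #include <string.h>

    /* How character-set growth shows up at the byte level: one
       'character' needs 1, 2 or 3 bytes once text is UTF-8 encoded. */
    int main(void)
    {
        const char *ascii = "A";            /* U+0041:          1 byte  */
        const char *latin = "\xC3\xA9";     /* U+00E9, e-acute: 2 bytes */
        const char *kanji = "\xE6\xBC\xA2"; /* U+6F22, a Kanji: 3 bytes */

        printf("ASCII letter   : %zu byte(s)\n", strlen(ascii));
        printf("Latin e-acute  : %zu byte(s)\n", strlen(latin));
        printf("Kanji character: %zu byte(s)\n", strlen(kanji));
        return 0;
    }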
But all this does not explain the XP operating system's need for over 300MB of RAM. All this is spent trying to optimise the slow dog that WinXP is. To achieve marginally faster disk access, it reads each disk's directory and FAT (File Allocation Table) into memory, and if you have six partitions, that amounts to a lot of space. It also keeps the very bloated system registry, Windows' nightmare, in memory. The problem is that this memory space is non-negotiable.
By comparison, Linux uses the technique of disk caching, putting all otherwise unused memory to work. Caching keeps the disk sectors actually in use resident in memory, so the computer runs faster and faster as it is being used. When applications need memory, the cache is released again, right down to the last block if necessary.
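You can watch this happening on any Linux machine: /proc/meminfo reports how much memory is currently serving as disk cache. A small sketch in C:

    #include <stdio.h>
    #include <string.h>

    /* Print the lines of /proc/meminfo that show Linux's disk cache:
       'Cached' grows as the machine is used and shrinks automatically
       when applications need the memory back. */
    int main(void)
    {
        char line[256];
        FILE *f = fopen("/proc/meminfo", "r");
        if (!f) { perror("/proc/meminfo"); return 1; }
        while (fgets(line, sizeof line, f))
            if (!strncmp(line, "MemFree:", 8) || !strncmp(line, "Cached:", 7))
                fputs(line, stdout);
        fclose(f);
        return 0;
    }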
Windows' downfall is that it also has a virtual memory allocation scheme in which memory is swapped to and from a swap file on disk. But this software still has the same bugs as in the DOS days; it makes the computer slower and slower as the day progresses, and may cause serious crashes.
Unreliability

One of Windows' worst nightmares is unreliability, some of which is not entirely its fault. It grew like a cancer, in an uncontrolled way, and as it is not a true operating system with a clearly defined and protected kernel and protection between application modules, a runaway program can corrupt the entire system.
Some of its early vulnerability occurred because it allowed earlier versions of system software to overwrite later ones as new applications were installed. Now all new software must first be 'certified', a cost that small developers find hard to bear.
Let me recount another nightmare.
Windows XP has a name for being much more stable than previous Windows versions, but this is an illusion. It has a crash handler and 'System Restore'. In other words, when it crashes, it restores a previous version (snapshot) of the registry and continues working after 'making repairs'. Part of this restoration system are the mysterious 'System Volume Information' (SVI) folders on every disk drive. When you make changes to a document and save it 'in place', it is in fact saved somewhere else and the old version remembered in the SVI. During a restoration you can lose your recent work as your document mysteriously reverts to an older version. Working with photographs, I have lost literally hundreds of them. Now imagine that these are the only copies you have, from a digital camera. But the disaster does not end there.
Initially I used a removable 160GB USB mass-storage drive to back my work onto. I discovered that photos went missing, and one particular case was so bad that I figured out what was happening. Windows had been 'making repairs' to my backup drive, even though it had been off-line for a week. So the moment I connected it up, I lost files. Let's not mince words here: this is a very serious bug-by-design. An operating system should under no circumstances write files on a backup disk, other than the actual data and its file directory. Likewise, a backup or synchronising program should never write any directories or files of its own.

But perhaps we can turn restoration off? Right-click on 'My Computer' > Properties > tab System Restore. There is indeed an option to turn it off, and better still, to do so selectively for each disk drive. Unfortunately it doesn't work, and SVI is still written onto every disk. You can now imagine that this is my most compelling reason to give Microsoft the boot and to try Linux.

Here is what a day's work looks like: after about 20-30 photographs, WinXP begins to run out of memory, noticeable by heavy disk swapping and a remarkable slow-down. Can I still do another ten photos? No, because then WinXP crashes in a most horrible way, summoning itself a reboot, and I will have lost a lot of work. So the lesson is to close all applications without delay, as soon as disk swapping occurs, and to reboot the computer. A good day's work will see two or three reboots. But this is still 'workable', wouldn't you think?
How does all this compare to Linux? Linux is a properly designed operating system, created in co-operation rather than competition. From the ground up it is based on good open standards, and it is extremely well documented. Its heap of software resembles a pyramid much more closely, with little duplication. All applications try to use the existing module libraries as much as possible, so that these are tested in all possible ways, ultimately leading to high reliability. A Linux system, for instance, needs a reboot very rarely and can be left running for years on end without any trouble. Only while new applications are being installed is some instability experienced, mainly because of 'dependency' problems, where a new application depends on a later version of a module library that is not yet installed.
An entirely different kind of unreliability comes from viruses and worms infecting the operating system. This is possible because of Windows' inherent security flaws and designated back-doors, because of Internet Explorer, and because so many Windows hackers exist. Remember that Microsoft has employed thousands of programmers, so internal vulnerabilities are widely known. An immediate remedy is to stop using Internet Explorer, although it is not advisable to uninstall it. Use Mozilla instead.
The nature of viruses is that they spread over the Internet, and if there are enough interlinking e-mail addresses, a virus can cause a pandemic, much the same as a real virus in an overcrowded population. As far as Linux is concerned, the chance of viral infection is very much smaller than in Windows. The same goes for adware and spyware.
Incompatibility

It is a bit strange to claim incompatibility of Windows files, for isn't Windows the de-facto standard and by definition compatible with itself? Then why should we worry?
Adobe has recently submitted its PDF format for acceptance as an international document standard. But will it accept proposals for change? Will it hold others to a standard in which Adobe has a clear lead? Will it accept that other software can produce PDF documents without licence penalties? The standard already looks like bloatware.
Most users of Word are not aware that 99% of their requirements are covered by the well-documented Rich Text Format (.RTF), which is also very much more compact and better suited to sending over e-mail. And what's wrong with using plain typewriter format (.TXT) a bit more? It is, after all, the most compact and standard of all. Likewise, the standard and well-documented Hyper Text Markup Language (.HTML) is particularly compact where images are included, and is also most suitable for e-mailing.
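To show just how simple RTF is, here is a small sketch: a complete, valid .RTF document produced by a few lines of C. The formatting codes are documented plain text, so any program can write them; the file name is illustrative.

    #include <stdio.h>

    /* Write a complete, valid RTF document from scratch. The whole
       file is human-readable text, a fraction of the size of the
       equivalent binary .DOC. */
    int main(void)
    {
        FILE *f = fopen("hello.rtf", "w");
        if (!f) { perror("hello.rtf"); return 1; }
        fputs("{\\rtf1\\ansi\\deff0"
              "{\\fonttbl{\\f0 Times New Roman;}}"
              "\\f0\\fs24 Hello, {\\b bold} world!\\par}\n", f);
        fclose(f);
        return 0;
    }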
Incompatibility in another sense is a particularly hot issue right now, as earlier Microsoft Office versions will no longer run on Vista. The document format has also changed and is no longer compatible. What does that teach you? Isn't it time to move to Open Office software that respects 'open document' standards that will still be readable in the next millennium?
Digital Rights Management

Digital Rights Management (DRM) is now part of Vista. What it means is that the State, the Police, the Judiciary and the Law are now part of the operating system, as if the computer can in any way be held responsible for its actions. It means that you are no longer permitted to make copies of what is not deemed yours. You cannot make backup copies of music and videos that you own, simply because you are not deemed to 'own' these things. The powerful media industry seeks means of distributing its magazines electronically by subscription, and there is no pain in that. But DRM also prohibits you from doing anything more with your subscription than reading the pages.
Let me be very clear about this. An operating system is just a blind slave, whose task it is to do what it possibly can in the most efficient way. To let Morality, Religion, Beliefs and Law enter its code is sheer lunacy and an invitation to let the (Police) State into the sanctity of your home.
I have mentioned before how the bullies, the hegemons of the USA, have been coercing software houses and equipment suppliers to serve their narrow interests of money and power. Anyone who is now powerful and wealthy enough can influence the behaviour of millions of computers. The door stands wide open not only for spying but also for propaganda, selective filtering of Internet access (anything you read), and any evil you can imagine, to be part of the system you buy.
So how is this in Linux? You will see in further chapters that here too the bullying goes on, even though the Linux community is proud of its independence and freedom. Some distros already prevent you from viewing Content-Scrambled (CSS) movies, the ones you buy or hire from the video shop. Imagine not being able to view a movie on your computer, simply because of this kind of bullying.
No system backup

It is inconceivable to me how businesses can tolerate Windows without proper backups. In recent years I've heard horror stories that were simply unthinkable twenty years ago. How is it possible that the world has embraced an operating system that cannot make a proper backup of itself?
The problem with any operating system is that a backup program must run while the operating system is active, which means that it cannot access files that have been locked. That means a backup of the C: drive can be done only from DOS, booted from a floppy disk or the like. But the operating system is the only agent capable of unlocking and freezing system files for the purpose of backup. Therefore system backup MUST be an integral part of the operating system, unless copying of locked files is permitted. My prediction is that Microsoft will never provide one, as it has its own interests at heart more than those of its customers.
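The locking problem is easy to demonstrate. The sketch below (Win32 C, with an illustrative file name) opens a file exclusively, the way many system files are held, and then shows that a second, backup-style open is refused.

    #include <windows.h>
    #include <stdio.h>

    /* Sketch of why an ordinary backup tool fails on locked files:
       a file opened with no sharing mode locks everyone else out,
       so a second open on the same file fails. */
    int main(void)
    {
        /* First handle: exclusive, the way many system files are held. */
        HANDLE h1 = CreateFileA("test.dat", GENERIC_WRITE, 0 /* no sharing */,
                                NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
        /* Second handle: what a backup program would attempt. */
        HANDLE h2 = CreateFileA("test.dat", GENERIC_READ, FILE_SHARE_READ,
                                NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h2 == INVALID_HANDLE_VALUE)
            printf("backup read refused: error %lu (sharing violation)\n",
                   GetLastError());
        if (h1 != INVALID_HANDLE_VALUE) CloseHandle(h1);
        return 0;
    }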
I'm not talking about backing up other drives, which is more straightforward. The lesson learnt here is that it is important for every Windows user to keep their own documents on other drives. Do not use the 'My Documents' folder, ever!
But there is some good news. A company from the Balkans, Paragon, has produced a reliable backup program (Paragon Drive Backup) that makes backups of entire partitions while "hot-processing", thus by-passing the file-locking mechanisms. The result is burnt onto a DVD, which is also a bootable disc with a restoration manager on it. I've been using this program since it first arrived and it has saved me on many occasions. It also reliably backs up and restores Linux partitions, even though it is a WindowsXP program. A 7GB partition compresses onto a 4.5GB disc, which is burnt in less than 15 minutes. Note that the PDB restoration utility on the bootable DVD can move the NTFS MFT (Master File Table), and enables you to resize the partition as well. Cost: less than $100, including a DVD burner. Well worth it.
NTFS and the Master File Table (MFT)

The FAT (File Allocation Table) filing system was good for small systems, even though the FAT32 design could address large volumes. Better designs had been available since UNIX and, later, Linux. But Windows re-invented the wheel with NTFS (New Technology File System), which uses a relational-style database of very large size with very large records, suitable for very large volumes and RAID drives. It is slow, cumbersome and excessive, but worst of all, it cannot re-organise itself. For every new file a new record is created, and when the file is deleted the record is marked obsolete; the table itself never shrinks. In this manner the MFT keeps growing and growing. In addition it can become fragmented in two ways: by creating new MFT extensions that are scattered over the drive, and by becoming 'skewed', requiring many lookups for some files. One would have thought that Windows NT, XP, Vista, ... would have dealt with this by providing reorganisation utilities, but this is not so. The result is that EVERY Windows system will eventually slow to a grind, using up all computer resources. Rumour has it that Microsoft is now working on this problem (Jan 2010) and that Paragon has a solution already.
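For the curious, here is a much-simplified sketch in C of an MFT file record header, following publicly documented NTFS layouts; the field names are illustrative, not Microsoft's. Every file on the volume owns at least one fixed-size record like this, which is why the table grows with the file count.

    #include <stdint.h>
    #include <stdio.h>

    /* Simplified sketch of an NTFS MFT file record header, after the
       publicly documented on-disk layout. Deleting a file only clears
       the in-use flag; the table itself never shrinks. */
    struct mft_record_header {
        char     signature[4];    /* "FILE" (or "BAAD" when damaged)  */
        uint16_t usa_offset;      /* update sequence array offset     */
        uint16_t usa_count;       /* update sequence array entries    */
        uint64_t lsn;             /* $LogFile sequence number         */
        uint16_t sequence_no;     /* record re-allocation counter     */
        uint16_t hard_link_count;
        uint16_t attrs_offset;    /* offset of the first attribute    */
        uint16_t flags;           /* 0x01 = in use, 0x02 = directory  */
        uint32_t bytes_used;      /* space used inside this record    */
        uint32_t bytes_allocated; /* record size, typically 1024      */
    };

    int main(void)
    {
        printf("header: %zu bytes of a typical 1024-byte record\n",
               sizeof(struct mft_record_header));
        return 0;
    }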
Limits to complexity

It is thought that one of the reasons ancient civilisations collapsed is that they reached a level of complexity that could no longer be supported by the working classes and peasants. The same applies to Microsoft, which thought that software problems could be solved by hiring more programmers. Literally tens of thousands of programmers have been working on the various MS software projects, resulting in a heap of code of which nobody knows how it all works. The original programmers have long since left, with little documentation to guide their successors.
With a complexity inherited from as far back as DOS, it is my opinion that Microsoft reached its limit of complexity some years ago. The four-year delay in the arrival of Windows Vista does not bode well, even though Vista offers little more than XP. I foresee that Vista has the same bugs as previous versions, perhaps plastered over a bit more. The registry was to be abandoned and its contents placed in a full-blown relational database at a low level in the operating system (it never happened); had it happened, it would have made Vista even slower than XP and more memory-hungry as well.
I fear (or is it glee?) that Vista may well be Microsoft's last version of Windows, since the company will not be capable of making the next jump, now aimed at 64-bit-wide chips. It could well be that Microsoft's future operating systems will be based on Linux.
But Vista will be sold to people buying their first hardware, and what standard can they compare it with? In the meantime businesses, professionals and enterprises will cast a wary eye, not keen to do the experiment soon. It could well be that the jump from XP to Vista is more daunting than the jump from Windows to Linux. We'll see.