Linux is the open source replacement for Unix, an alternative to the Windows Web server, and a contender to replace the Windows desktop operating system. Linux started replacing my Solaris Web servers back in 1996 because Solaris required Sun hardware and Sun hardware was more expensive but no faster than competing hardware. I placed most of the Web sites on servers managed by Web hosting companies. They preferred the security of Solaris but some were changing to FreeBSD and Linux. How useful is Linux today, 13 years after that first use?
Choice of Distributions
Windows used to be a simple choice, just a desktop version and a server version. Now there is a huge range of confusing choices. Linux offers an equally confusing range and I hope to simplify the choice on this page.
Linux variations are divided into distributions and many distributions now ship everything on one DVD, with the desktop and server options installable from one menu. You can create any mix using the one DVD. This actually makes Linux easier to install than Windows.
Web site servers were the big new thing in the mid 1990s and Apache was the Web server software of choice. Apache ran on every useful operating system including Linux. Windows and Linux gave you the widest choice of hardware. Brand name replacements for Sun Solaris servers were from 20 percent to 50 percent cheaper than the Sun software and hardware combination. You could set up an effective Web server using generic components for about 70 percent less than the cost of a Sun Solaris equivalent.
Today you can set up Linux on almost any hardware. New hardware arrives with updates for Windows and the most advanced Linux distributions implement similar updates within weeks or a few months. If you are not at the bleeding edge testing unusual hardware, you can use Linux. You get the best choice of competitive hardware and an operating system with no licence management problems.
Occasionally you might have to wait a month or two to get new hardware working but that is no longer than tracking down all the weird errors in a new version of Windows. You never know how long Microsoft will take to fix an error and that is the same with open source distributions. If your Linux does not recognise the new network chip in a new computer, as happened to me recently, you can slip in an old network card for a few months or experiment with other Linux distributions. I will point you to the better distributions for handling new hardware.
Prior to 1996, my only use of Linux or Unix was in servers and dedicated devices, with BSD style Unix dominating the dedicated devices and Solaris dominating the Web servers. Everything else was Windows or NT. Solaris was rated the most secure Unix and NetBSD second. To make Linux as secure as Unix, you had to run a hardening script named Bastille and Bastille had limitations. Unix was the king of security.
Unix fell to Linux because there were many versions of Unix with few receiving adequate attention. There were a million people improving Unix and only a few tens of thousands improving Linux but everyone working on Linux was working on the same Linux. Some versions of Unix were so specialised that only a few dozen people, out of that million, worked on them. Linux became an easy option for replacing less successful versions of Unix.
Unix survived on dedicated devices because the BSD style licence is better for dedicated device developers. The Linux GPL licence is perfect for servers. Linux ate up the Unix share of the server market by replacing the least popular versions of Unix. There is always one Unix that is less popular than the rest and a target for replacement by Linux. Eventually IBM replaced their proprietary versions of Unix with Linux and contributed serious improvements to Linux. Big corporations shifted from Unix to Linux when IBM approved Linux.
Recently Apple copied IBM by throwing out their Apple operating system and adopting an open source operating system. Apple decided to use an operating system with a BSD licence so they could paste the Apple label on the free operating system and charge money for the result. That leaves just Apple and Sun trying to keep Unix alive on general purpose computers.
My first attempt at replacing a Windows system with Linux is not recorded. I recently threw out some old floppy disks and one was labelled as an Nvidia driver for Linux created on March 16, 2001. The actual floppy disk was marked as put into service on June 11, 1995, which was about the time a lot of Web hosting companies started offering Linux as an alternative to Solaris and I started using Linux on test servers.
Back in 1995 I probably used the floppy to download and install a driver for Linux because those versions of Linux were squeezed into one CD and one CD was not enough to store all the hardware drivers needed to get computers working.
Linux was rarely a replacement for an existing server and was usually an alternative for a new server. If I had to split a Web site off a general purpose server, I would set up the new server with Apache on Linux or NT. NT often won because NT had better installation procedures and a better default file system.
2001 is a long time after I started setting up NT and Linux based Web servers. 2001 is about the time I was deciding between an upgrade from NT to Windows 2000 or putting everything on Linux. Weeks of work on Linux produced really basic Web servers and desktop computers but nothing to the standard I required. What I could not achieve with a week of work on Linux was easy to do in 30 minutes using either NT or Windows 2000.
I am still persisting with Linux today and, for 2008, I will get a Linux server running in my office for something other than a basic Web server. I will set up a workstation to replace my Windows workstation. I will use the latest hardware and I will set up all the reliability features I use in Windows. I will try to set up everything using normal installation procedures and not the Unix black screen of command line death. 2008 is the year of Linux.
Update January 2011. I have Ubuntu 10.10 running on several computers next to Windows XP on a computer with identical hardware. The Windows XP machine is used more than 15 hours every day. The Linux machines are used only a few hours each day. The XP machine is used with up to 60 applications open. The Linux machines are used with only one or two applications open. The odds are really stacked against the Windows XP machine. Each Linux machine crashes once or twice per day. The XP machine has not crashed for many months. The combination of Linux and applications supplied in Ubuntu 10.10 is as unreliable as Windows was back in the 1980s.
Black Screen of Command Line Death
The one drawback with Linux and Unix is reverting to the Unix command line. If I wanted to use a command line, I could use DOS from the 1980s. One of my teachers described working with command lines through teletypes in the 1950s. The earliest known computer, a mechanical device from over 2000 years ago, had a graphical user interface, so why should I have to learn to use the command line today?
Unix and Linux users complain about something in Windows named the Blue Screen of Death, BSoD, but it is so long since I last saw a BSoD that I cannot recount the circumstances. I frequently see Linux systems devolve to their command line form, the Black Screen of Death, BSoD, and I often have to visit the same screen to fix problems created by errors in the Linux administration programs. You cannot heap rubbish on the Windows BSoD then expect people to convert from Windows to Linux when almost every Web page about Linux points you into their BSoD.
Fortunately there are Linux distributions where the use of BSoD is rare.
Windows has home editions that traditionally used the FAT file system and professional editions using NTFS. NTFS offers full security plus improved efficiency and reliability. Old versions of Linux defaulted to a file system named Ext2 that was better than FAT but not as good as NTFS. Today most versions of Linux use Ext3, a great replacement for NTFS.
There are only three problems remaining with Ext3. The first is the awkward handling of spaces in file names. In Windows you can name a file caterpillars spring 2007 but in Unix and Linux the command line treats a space as a separator, so the convention is to replace the spaces with something else, caterpillars-spring-2007.
The second problem is that Linux and Unix file systems are case sensitive. In Windows you can type a file name as motheggs, Motheggs, MothEggs, or motheGGs and you still get the one file. When you type the file name with a different case in Linux or Unix, Linux and Unix look for a different file. You can spend hours looking for a file in Linux if you do not know how the file name was typed. Fortunately some modern Linux equivalents to Windows Explorer can search without regard to case and find files no matter how the file name is typed.
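One partial escape, if you are willing to touch the command line, is the standard find command: its -iname test matches file names without regard to case. A minimal sketch, using an example directory and file name of my own invention:

```shell
# Create a file with mixed case, then find it without knowing the case.
mkdir -p /tmp/moth-demo
touch /tmp/moth-demo/MothEggs.txt
find /tmp/moth-demo -iname 'motheggs*'   # prints /tmp/moth-demo/MothEggs.txt
```

The plain -name test would miss the file unless you typed the capitals exactly as they appear on disk.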
The memory cards in your camera, telephone, music player, and video player use the FAT file system instead of Ext3. The file names have to be translated when moving files from FAT to Ext3. Linux also detects the type of a file by methods other than the file extension, which is different to Windows. Generally Linux will handle the movement of a file from Windows to Linux but may create files that will not work when moved to Windows. Ext3 will be more useful when the manufacturers of portable devices replace FAT with Ext3, and the manufacturers will not use Ext3 until Microsoft builds Ext3 compatibility into Windows.
When you create files in Linux, type all the file names in lower case so nobody has to remember the way you typed the file name. Use file extensions for all files that may end up on Windows. Change the defaults in your applications to create the right file extensions. Always use binary file transfers between operating systems so they do not try to change the data within the files.
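As a sketch of that convention, a short shell pipeline can convert a Windows style name to all lower case with dashes instead of spaces. The file name is just an example:

```shell
# Lower the case, then swap spaces for dashes.
windows_name="Caterpillars Spring 2007.TXT"
linux_name=$(printf '%s' "$windows_name" | tr 'A-Z' 'a-z' | tr ' ' '-')
echo "$linux_name"   # prints caterpillars-spring-2007.txt
```

Run over a whole directory of incoming files, this removes the guesswork about how each name was typed.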
You know that Windows sells in different flavours. Linux has equivalents. Windows went through 3.1, 95, 98, 2000, XP, and is now at Vista. The Linux kernel uses a simple consecutive number and is now at 2.6.
Windows has desktop and server editions. Linux is available in desktop and server editions plus some suppliers of Linux have specialised editions for education, database servers, network attached storage servers, and some truly weird uses. The versions use the same basic Linux with different collections of optional software packages stacked on top.
The specialised versions of Linux are usually designed to fit on one CD. Some of the general purpose versions of Linux are now packed on DVD and include every option with an installation process that lets you choose what you want. You can use one DVD for every computer in your company.
The wide range of Linux versions is divided into distributions and each distribution is a brand. The brands include Debian, Red Hat, SUSE, CentOS, Fedora, Mandriva, and Ubuntu. Some distributions are based on other distributions. CentOS is a rebuild of the Red Hat distribution with the Red Hat branding and advertising stripped out. Ubuntu is a Debian derivative simplified for beginners setting up Linux on desktop computers.
Debian is the mountain of Linux, solid and slow to change. Debian offer the reliable secure stable version and a frequently updated testing version. The testing version is a monkey, quick to learn new tricks but not as reliable. Use the stable version first and when the stable version fails to recognise new hardware, try the testing version.
The Debian installation process offers more flexibility than most Linux distributions and is slightly easier to use when setting up anything other than the simplest disk configuration. The Debian default disk setup is better than the default disk configuration produced by other distributions. Debian is the only version of Linux I can get to work on flash devices. The testing version of Debian is often the first Linux distribution to work reliably with the latest hardware. Currently the only close competitor is Fedora.
The Debian testing version can be updated weekly using Jigdo, which means you could reload a rewritable DVD every week and always install the latest version of everything.
Red Hat focuses on the corporate server market and packages Linux with proprietary products and support. When you want a corporate server without support, look at CentOS or Fedora.
Red Hat do not produce an equivalent to the Debian testing version. Instead Red Hat sponsor the Fedora project to test the latest software. Fedora is also a better choice for desktop computers.
SUSE used to be a market leader in Europe before being taken over by Novell. I do not know who chooses SUSE today. There is a community version named openSUSE and openSUSE works in a similar way to the Fedora project.
CentOS is the most popular version of Linux for building servers when your budget does not include money for Red Hat support. Many people grow up using Red Hat Linux without calling Red Hat for support so why pay for support? You can use CentOS instead of Red Hat and save a few dollars.
Debian is the main alternative to Red Hat but uses different directory conventions and differs in a few other ways. If some of your computers are using Red Hat then you want all of them to use the same conventions, which makes CentOS a better choice for you than Debian.
I find the CentOS installation procedure easy but the disk configuration defaults are more complicated than needed and CentOS does not work with flash memory devices. I prefer Debian for RAID creation and Fedora for the better installation interface.
Fedora is a community project sponsored by Red Hat and Red Hat use Fedora for testing new software, much the same as Debian use their testing version. On a small number of tests, Fedora recognised hardware before CentOS or Debian but Fedora missed some items recognised by the Debian testing version. Fedora was ahead of Ubuntu at the time of the tests.
Fedora now has a respin service where updates are applied and made available through Jigdo. The Fedora Jigdo update is new, irregular, and not at the automated weekly level of Debian. When Fedora have automated weekly updates, there will be little to choose between Fedora and Debian.
Windows 2000 beat Linux for easy installation back in 2001 and the nearest competitor was Mandrake Linux. Mandrake Linux is now named Mandriva Linux.
Mandrake was based on Red Hat Linux and included a better installation procedure. I used Mandrake for several projects up until their release 9 but Mandrake was not at the stage where it could replace more than my most basic Windows computers.
I recently tried using the latest Mandriva Linux to build a NAS device, a Network Attached Storage server, and Mandriva failed to fit the task. Debian and Fedora are better fits for a NAS server and just about everything else. Ubuntu, based on Debian, is a better choice than Mandriva for easy desktop installation by beginners because Ubuntu has a large support community working through forums.
Lots of people convert old desktop computers to Ubuntu to learn Linux. They usually keep their newest computer on Windows. Some people report success with dual boot computers containing both Windows and Ubuntu but there are as many failures reported as successes and you cannot use both operating systems at the same time.
I tried Ubuntu a few times and it works on old hardware but not new hardware. Ubuntu is based on the stable version of Debian, which supports hardware about a year or more old. Try Ubuntu when you buy a new Windows computer and you are wondering what to do with the old machine.
There is a testing version of Debian that is updated every week and works with most of the latest hardware within a few weeks but Ubuntu does not have an equivalent. Debian is available as a DVD with every option included, which is, for me, a better choice than the smaller Ubuntu CD because Debian offers all the installation options I need on one disk. I suggest learning on Ubuntu then moving to Debian for flexibility.
I like the way Ubuntu handles unknown displays. Ubuntu tells you that the monitor type is unrecognisable while other distributions leave you in the dark. Ubuntu pops up a window where you can manually specify the monitor characteristics while the other distributions make you work like a witchdoctor using ancient DOS commands to edit a configuration file. May all the other distributions learn this one trick from Ubuntu.
Update October 2010. See the notes about Ubuntu 10.10 in Linux Speed.
All Distributions of Linux
Some distributions of Linux only give you control of the installation process if you use a special startup option. My comparisons of Linux distributions include features that may require those special startup options.
Linux is miserable at RAID on the desktop workstation compared to Windows. On Windows you get RAID free with the server versions of Windows and it almost always works exactly how you expect without any special configuration work. The desktop version of Windows usually gets RAID support from a hardware assist chip with a special driver and that type of RAID usually works exactly as expected without surprises.
Linux RAID has all sorts of problems. Some server oriented distributions have acceptable configuration tools but many do not and almost none of the desktop oriented distributions install RAID or provide an easy option for RAID.
The most common recommendation in Linux forums is to use dedicated hardware RAID instead of Linux software RAID. Dedicated hardware RAID uses a second computer to isolate the disks and perform the RAID work. You duplicate the software, processor, and memory on a special plug in card. What a waste of resources. Some of the add-on cards are more expensive than a whole desktop computer. Linux needs better software RAID configuration and management tools.
A few tiny changes to Linux would make Linux RAID workable and the changes appear to be only in the installation process, not the working software. Linux forces you to create partitions on disk then convert the partitions to RAID. You should be able to select RAID first then point to a set of disks and have the matching partitions created automatically. As an alternative, create native partitions on one disk then tell RAID to copy the set onto the other disks.
After you install Linux, there is nothing to help you add, change, manage, or restore RAID arrays. There might be some weird command line stuff for coffee drinking, pizza eating Unix buffs, with or without pony tails, but nothing for regular computer users. Debian is the best choice for RAID with Fedora and CentOS running a close second; just do not use the default CentOS or Fedora configuration because it is overly complicated.
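For the Unix buffs, that weird command line stuff runs through a tool named mdadm. This is only a sketch of the mirror creation step, assuming two spare disks at the hypothetical device names /dev/sdb and /dev/sdc; it needs root access and destroys anything on those disks:

```shell
# Sketch only: build a two disk RAID 1 mirror from two empty disks.
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
mkfs.ext3 /dev/md0        # put an Ext3 file system on the new array
cat /proc/mdstat          # watch the initial mirror build
mdadm --detail /dev/md0   # report the state of the array
```

Nothing here is hard once you know the incantation, which is exactly the point: a menu in the installer could run these steps for you.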
Update October 2010. The Ubuntu 10.10 alternate edition is equivalent to the Ubuntu desktop edition and has added RAID plus an expanded Debian style installation process to let you use RAID. The Alternate edition installation process is slightly more complicated and not recommended for beginners who do not need RAID.
You need an MBR on your computer. Windows automatically creates the correct MBR; you do not have to know what an MBR is because it always works. In Linux you have to read hundreds of Web pages and forum posts to find out how to create a working MBR and the creation process usually requires the use of ancient command line instructions. Fedora comes closest to creating MBRs reliably but only if you use complicated options during the installation procedure.
Linux and Unix advocates are quick to dismiss Windows as DOS with something stuck on top but then they tell you to fix everything in Linux by using the Linux equivalent to DOS, the command line. The Linux advocates need to replace the black screen of command line death with something at least as good as the 1970s style Debian installation process interface.
You should be able to specify where the MBR is created because most distributions of Linux place a new MBR on the wrong disk when there are several disks. You should be told where the MBR will go and have the opportunity to select another disk.
Many Linux distributions do not tell you when they change an MBR. Some Linux distributions tell you they are changing the MBR but never actually make the change. This is one area needing drastic improvement because diagnosing the problems is extremely difficult and there is no easy way to fix the result. Debian and Fedora produce the least problems in this area and give you the most power to override their mistakes.
A boot loader is the fourth thing you need to start a computer after electricity, the BIOS, and an MBR. Modern Linux distributions use GRUB but the GRUB installation process is primitive and almost useless with RAID. You have to perform multiple complicated processes to get a working GRUB for anything other than a single disk based computer. You have to fix many of the GRUB problems using the ancient black screen of command line death. If I wanted to fix everything in DOS, I could go back to Windows 3.1. Fedora installs GRUB reliably on most occasions and gives you an override option to correct the few wrong choices made by Fedora.
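When GRUB does end up on the wrong disk, the fix is, unfortunately, more of the same black screen. As a hedged sketch, assuming the hypothetical device names /dev/sda and /dev/sdb and root access, grub-install writes the boot loader to a disk of your choosing:

```shell
# Sketch only: write GRUB into the MBR of a chosen disk (needs root).
grub-install /dev/sda
# For a RAID 1 pair, repeat on the second disk so either disk can boot
# when its partner fails.
grub-install /dev/sdb
```

The second command is the step almost every distribution installer forgets, which is why a RAID mirror that boots perfectly can become unbootable the day the first disk dies.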
When you install an operating system on many computers, you want to use the latest update but do not want to download a whole DVD every week. You could install a mirror on one of your servers then install to new computers using the mirror. Jigdo is the new alternative to network installs.
Jigdo downloads the whole DVD on the first download but instead of downloading a finished image, it downloads a template then all the components then builds the image. When you want an updated image, Jigdo can download an updated template then download just the components that change.
Debian offers weekly updates via Jigdo and Fedora recently started supplying updates using Jigdo but not as frequently. You do not save anything on the first download. Subsequent updates are only a few hundred megabytes instead of downloading the whole 4 gigabytes again. Use Jigdo if you have more than a couple of computers.
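A rough sketch of the workflow using the jigdo-lite script; the URL is a placeholder, not a real address, and the real one comes from your distribution's download pages:

```shell
# First run: fetch the template and every component, then build the image.
jigdo-lite http://example.org/images/weekly-DVD-1.jigdo
# Later runs: when the script asks for files to scan, give it the path to
# the previous image so only the changed components are downloaded.
```

The saving comes entirely from that scan step, which is why keeping last week's image around is worth the disk space.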
People keep telling me that Linux is faster than Windows. When I test Windows and Linux side by side with the same software and hardware configuration performing the same work, there is little difference. Read the details in Linux Speed.
Update October 2010. Ubuntu 10.10 has significant speed improvements over earlier versions of Linux on my hardware. It might be improvements to Linux or better hardware drivers.
Use Debian, Fedora or CentOS Linux for your Web servers. Use Debian or Fedora for everything else. Use the testing version of Debian for new hardware. Give Ubuntu to people who want to learn Linux at home and point them to the Ubuntu forums for support. Use Jigdo, where available, to download distributions then download just the updates for subsequent installations.