
Linux, OpenBSD, Windows Server Comparison: Staff Issues

It appears to be widely accepted that UNIX administrators cost more, i.e. have higher salaries on average, and that there are more Windows administrators available. I have no basis for disagreeing with either contention. I do, however, strongly disagree with the conclusion that is often drawn from these apparent facts: that UNIX machines are more expensive to administer than Windows NT or 2000 machines. What matters is not the salary of a single staff member, or the average salary of the staff who support a platform, but the average support cost per machine or user.

I'll use two hypothetical examples. A $70,000 per year UNIX administrator who supports 25 UNIX servers is much cheaper than a $50,000 per year Windows administrator who manages ten servers. Likewise, one $85,000 UNIX administrator who manages 10 UNIX servers with 2,000 users each is a bargain compared to ten $50,000 per year Windows administrators who manage 10 servers each with 200 users. Both of these situations are hypothetical, but if they were real, then obviously UNIX would be the cheaper platform in these circumstances, based on per machine or per user support costs.
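To make the comparison explicit, here is the per server and per user arithmetic behind those hypothetical figures; the salaries and machine counts are, again, purely illustrative:

```shell
#!/bin/sh
# Per-server support cost from the first example:
echo $(( 70000 / 25 ))   # UNIX:    $2800 per server per year
echo $(( 50000 / 10 ))   # Windows: $5000 per server per year

# Per-user support cost from the second example:
# one $85,000 UNIX admin covering 10 servers x 2000 users, versus
# ten $50,000 Windows admins each covering 10 servers x 200 users.
awk 'BEGIN { printf "%.2f\n", 85000 / (10 * 2000) }'              # UNIX:    $4.25 per user
awk 'BEGIN { printf "%.2f\n", (10 * 50000) / (10 * 10 * 200) }'   # Windows: $25.00 per user
```

Measured this way, the higher-salaried administrator is the cheaper one by a wide margin.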

For internal systems, where user accounts and logins, possibly separate e-mail accounts, and other application systems with clearly defined individual account names are matched to specific employees, counting by users is likely to be the better measure. Counting by servers is likely to be better where public sites are involved. These range from static, information-only web sites with no user tracking to full e-commerce sites, where a single customer, defined at the application level, may account for one transaction a year or hundreds of transactions per month.

I've never seen any figures on the cost to support contemporary servers of any platform, and it's probably just as well. I recall that in the very early '90s, the cost of PCs was frequently discussed. I saw various numbers that ranged from a little under $10,000 per year to about $15,000 per year. I always wondered what these numbers were supposed to represent. No one ever broke them down in any useful way. Did they include networking and a share of the server costs? Did they include any allocation for the costs of midrange or mainframe applications accessed from PCs? What about consulting and new enterprise application development?

As the IT manager at a very large and fairly well-to-do non-profit, I knew what our total IT expenses and capital budget were, and our total IT expenses per employee were about a half to a third of what the industry was supposed to be spending on PC support. We may have been doing a very efficient job supporting our PCs, or perhaps a very poor job. My knowledge suggested that we were below average for the entire computer industry but not significantly out of line for small businesses or not-for-profits. Perhaps these numbers included things that almost no one would normally think of as PC support, such as an allocation for midrange and mainframe application support costs.

I suspect that these numbers were in some way useful to the large corporate clients that paid the research and consulting firms that developed them. If you knew the assumptions and methodology, you'd likely reach similar conclusions, so the numbers weren't fake, but the studies were more or less created to reach desired conclusions. The numbers never came with explanations or breakdowns because, if, as I suspect, they included every system that touched the PC in almost any way, then no one would give them any credence.

If today someone claimed the average UNIX server costs $25,000 per year to support and the average Windows 2000 server costs $10,000, no one with any sense would take this seriously without expecting a great deal of detailed information explaining how these numbers were arrived at. Even if, under a specific set of conditions, these numbers were correct, the UNIX server still might be the cheaper solution, as long as it could do 2.5 times or more the work of the Windows server. I can't provide any numbers, but I can draw what seem like logical conclusions based on my own fairly extensive experience with both UNIX and Windows systems.


The first UNIX system that I was responsible for was an AIX system purchased to run a new association management system that I'd played a large role in selecting. At first I hated AIX. For several years on DOS, I'd used command line enhancers that provided basic command line recall and editing. AIX seemed primitive and difficult. I was an experienced COBOL, dBASE, C, Assembler, BASIC and DOS batch programmer, with familiarity with some other languages. I overcame my initial limitations and learned that AIX was quite different from DOS but, once learned, far more capable. I learned to customize my environment, the menu system provided by our vendor, and some of the key system initialization files.

Over time, all routine system administration tasks were significantly automated. Backup simply required inserting a new tape in the drive each day; if the previous day's tape was not ejected, this indicated a problem that needed investigation. A new user setup script set groups and the default printer, enabled or disabled certain login menu choices including access to the shell prompt, and allowed or disallowed remote dialup access, based on a user's department. Database files were examined weekly and resized as necessary. All logs were rotated, and logs, print jobs, user queries, and exports were trimmed after appropriate periods. User files in public directories that did not conform to naming standards were automatically removed. Nightly loads of the research database and standard periodic data exports were automated. For things that could not be automated, administrators were notified via e-mail when conditions that might require attention developed. These included tape drives that needed cleaning and other system messages, filesystems that went over 90% full, and load averages that exceeded a normal early morning level.
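As a rough illustration of the filesystem check described above, a nightly cron job along these lines could do the work. This is a sketch, not the original script: the check_usage function name, the 90% threshold, and the mail recipient are my own hypothetical choices.

```shell
#!/bin/sh
# Sketch of a disk-space watchdog: reads `df -P` style output and
# prints any filesystem whose use percentage exceeds the limit.
check_usage() {
    limit="$1"
    awk -v limit="$limit" 'NR > 1 {
        use = $5
        sub(/%/, "", use)              # strip the trailing % sign
        if (use + 0 > limit) print $6 " at " use "%"
    }'
}

# From cron, the report could be mailed only when it is non-empty, e.g.:
#   warnings=$(df -P | check_usage 90)
#   [ -n "$warnings" ] && echo "$warnings" | mail -s "disk space" root
```

Once a check like this runs unattended, the administrator only hears about filesystems when one actually needs attention.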

After these functions were scripted, the central system on which all the association's financial and membership information resided, and from which all its billing and dues functions were performed, required less IT staff management than any other multi-user system or device, including the Novell file server, the Internet gateway, the e-mail gateway, the cc:Mail server, and two web servers, one of which also included a list server. The AIX system was much less work to maintain than the Wang systems it replaced. My experience since suggests that any routine function on UNIX systems can be automated without much difficulty. Doing so requires some nontrivial knowledge of what the system is doing. It requires knowledge of command line programs and their options as well as scripting skills, but these grow with experience and the opportunity to use them.

On Windows systems, very few applications include the scheduling options, command line options, or user accessible configuration files that are necessary for automation. Theoretically a macro might be used, but macros are likely to break down if variable input, such as a date, is required by the GUI program. Even if a suitable macro tool can be located, why should programming an obscure proprietary tool, which may or may not have all the necessary functions, be preferable to programming one of a few fairly standard UNIX shells, or Perl, which is almost universal today?

Further, the scheduling capabilities of Windows NT are pathetic compared to UNIX's cron. Perhaps Windows 2000 has actually included a decent scheduler, but I've seen nothing to suggest this. More sophisticated schedulers than cron exist, but in 8 years, I've never had to schedule a job that could not be scheduled on one cron line. The added flexibility that I've seen in other schedulers is not worth the complexity that comes with the largely unused flexibility. By contrast, the Windows NT scheduler is so limited that for jobs that repeat more than a few times a day, I normally build the scheduling logic directly into the script, run the script continuously, and include logic so that an external trigger will stop the script. Few UNIX scripts that I write include any code related to scheduling, as that is entirely handled by cron; nearly all Windows scripts that I write include code explicitly devoted to scheduling issues, to compensate for the inadequacies of the Windows scheduler.
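To make the contrast concrete, here is a sketch of the two styles. The crontab line is an ordinary example (the script path is hypothetical); the shell function below it mimics the Windows workaround, where the script loops continuously until an external trigger file tells it to stop:

```shell
#!/bin/sh
# UNIX style: the schedule lives entirely in crontab, e.g. every 15 minutes:
#   */15 * * * *  /usr/local/sbin/trimlogs >/dev/null 2>&1

# Windows-workaround style: the script schedules itself, sleeping between
# runs, and exits only when an external trigger file appears.
run_until_stopped() {
    interval="$1"; stopfile="$2"; shift 2
    while [ ! -f "$stopfile" ]; do
        "$@"                    # do the real work
        sleep "$interval"
    done
}

# Example: run a job every 60 seconds until someone creates /tmp/stop.job
#   run_until_stopped 60 /tmp/stop.job /usr/local/sbin/trimlogs
```

Every line of run_until_stopped is scheduling overhead that a cron-driven script simply never needs.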

If what I say about routine tasks and scripting on Windows and UNIX systems is true, what (if any) conclusions might be drawn about the staff who administer these systems? Intelligent people and good problem solvers find a high degree of repetition boring and frustrating. Where the opportunity exists to eliminate repetition by using available tools to do the work, the good problem solver will see a challenge to be met, and feel a sense of accomplishment, when the routine task is automated. If repetition is clearly present but there are no tools to eliminate the repetition through automation, the good problem solver will be bored and frustrated, and will eventually seek other opportunities regardless of compensation.

Many persons are quite content with a high degree of repetition in their lives. Are these the people you want administering your computer systems? Can they solve problems when they arise? The large majority of computer administrators who are worth hiring, if given a reasonable adjustment period, will likely be happier in a UNIX world than a Windows world, even if they do not recognize that today, because they only have a one environment view of computers. In this regard, Macintosh computer experience is of no use at all, as Macs go even further to hide the system details from users.

Reliability Impact on Staff

In the six years that I worked with AIX, there were a few disk problems, which system error messages alerted us to before there was a failure. Otherwise, every problem we had was sooner or later traced to an administrator change or error. After a small adjustment period of several weeks with Linux and OpenBSD, these systems were largely stable; there were a small handful of irregularities (discussed under reliability), including one apparent OpenBSD crash. One specific PC does not seem to work with the X Window System, but this failure is at least consistent. Generally, however, I'd describe both Linux and OpenBSD as highly stable, and when I have a perplexing problem I assume that it somehow relates to recent changes. Even when at first there doesn't seem to be any plausible relationship, so far I've been able to track down and fix every persistent problem encountered on Linux and OpenBSD.

I've worked with Windows NT as my primary operating system for most of the past five years, and though on a few occasions I have experienced a system remaining up for several months, I simply never have any idea when or what might bring an NT system down. Worse than uncertainty regarding when a system might go down is complete uncertainty about what it will take to get it back up. Most of the time a simple reboot is all that is necessary, but I've also had to repartition hard disks, reinstall the OS, and restore from backups. The worst situation I had to deal with occurred when a power supply failed on my GeodSoft NT web server. I had backups that were up-to-date and two similar, but not identical, PCs on which to build a replacement server. For much of two days, I tried unsuccessfully to build a new NT server on the spare PCs using the backup data. It wasn't until the original PC was repaired that I was able to restore the server to use.

I've seen enough registry corruption on NT systems that, since mid-1999, I have rarely built a new NT system without installing two systems. The first goes into C:\WINNTbak and is a bare bones install, without even networking, but including whatever backup software will be installed on the production system, installed to the real location. The second, "real" install then goes to the standard C:\WINNT. As long as you do frequent backups and keep the administrator passwords synchronized, it doesn't matter how badly the live system gets damaged (software corrupted); you can boot to C:\WINNTbak, erase C:\WINNT, restore from the last backup, and be back to where you were as of the backup, without needing to do a system install.

There is little more frustrating than having a new project that you want to work on, only to find that some hours or days must be spent fixing a Windows server problem, the cause of which is not likely ever to be known or is simply inexcusable, like the metabase.bin corruption described on my page "New Windows Anomalies". Dealing with a growing number of unstable NT servers was one of several factors that led to my leaving my last regular job. Unfortunately, as I still believed that Microsoft and Windows were an essential part of my career, I included a Windows server in the mirrored set, and it has been more troublesome than all my other servers combined. When I started this venture, NT was still the current Microsoft server. Since Windows 2000 has been released, my NT skills are losing their value. As I'll discuss further in my conclusions, I see little value for Windows 2000 servers in many, perhaps most, small businesses, and certainly not in mine. As I do not wish to support Windows 2000 servers professionally, there seems to be no reason to acquire one, so it's likely to be only a matter of time before I drop the current NT web server. (This occurred some weeks after I wrote this paragraph, but before I had this comparison complete and online.)

Not only are Windows system problems much more common than UNIX system problems, but situations where there is no clear explanation for a problem that has occurred are fairly common on Windows systems and almost unheard of on UNIX systems. There is little more frustrating than dealing with a problem that you can make go away, but that you don't understand and can never be sure when, or if, it will return.

Administering Windows systems feels like running on a treadmill on which random obstacles appear. No matter what you fix today, there is no telling what you may have to deal with tomorrow, and each product added to a Windows system is likely to add to the workload, creating a permanent backlog of things that should get done but don't. Administering UNIX systems is more like climbing a mountain. You write scripts to automate the more time consuming tasks, and as your scripting skills improve, you find all the routine tasks have been automated. When a new product is added, instead of being a continuing resource drain, it's a new challenge to automate. At some point you feel that you've finished with that product, and it's just another part of the system and perhaps another line on your resume. You look down (back) and see a specific series of accomplishments in which you can take pride, rather than an endless series of frustrations that leave you wondering where your workdays went.

A review of the contents of my website is illustrative. With a few exceptions, most of the problem descriptions deal with Windows systems, and most of the specific accomplishments have been on UNIX systems or in programming that is system neutral. In the long run, most of the technical employees you want to keep will be happier in UNIX environments, if they are given adequate training and opportunities to adjust to changes.



Copyright © 2000 - 2014 by George Shaffer. This material may be distributed only subject to these terms and conditions, which are subject to change. Distribution is subject to the current terms, or, at the choice of the distributor, those in an earlier, digitally signed electronic copy from the time of the distribution. Distribution of substantively modified versions of GeodSoft content is prohibited without the explicit written permission of George Shaffer. Distribution of the work or derivatives of the work, in whole or in part, for commercial purposes is prohibited unless prior written permission is obtained from George Shaffer. Distribution in accordance with these terms, for unrestricted and uncompensated public access, non profit, or internal company use is allowed.

