GeodSoft

Linux, OpenBSD, Windows Server Comparison: Summary and Recommendations

If you've stayed with me this far or read significant portions of the preceding sections, what comes next follows naturally. No operating system is perfect or even close, even when limited to server-only or desktop-only roles. Making any selection necessarily involves tradeoffs, including reducing the choices that may be made in the future. Even the choice to employ multiple server operating systems, which suggests flexibility, will incur certain costs.

I know that, with sufficient resources, Windows NT and 2000 servers can be made stable. On a theoretical level, due to the large number of tightly integrated products, development tools, and extensive application availability, including a variety of third party middleware, application development tools, and environments, it may be possible to "do more" on Windows servers than on other platforms. As a practical matter, Windows systems, and servers in particular, rarely live up to their promise.

Windows is a bloated system, containing an enormous array of unused features that few of its users need, understand, or even explicitly want; these features may not be easy to disable and certainly cannot be easily removed in their entirety. It's generally comparatively unreliable, and thus resource intensive to maintain. Windows server purchases are often poor choices because they are made not on any careful matching of needs and capabilities with cost comparisons, but simply because Microsoft is the current market leader, and because of a general perception that whatever needs to be done can be done on Microsoft Windows systems.

Theoretically, Windows NT and 2000 can be made secure, as their built-in security functions are more sophisticated than standard UNIX security facilities. As a practical matter, however, such an effort will be very labor intensive and will result in a machine that does not look like a Windows server and will likely not support functions that may have been part of the reason Windows was selected in the first place, or that are at least necessary to run the intended applications.

At a somewhat lower level of security, with appropriate measures, it should be possible to secure Windows servers against penetration by any but highly skilled intruders. That is, network and infrastructure attacks should be closed or fixed as soon as they are found, i.e., patches applied, leaving only sophisticated application level attacks. With the series of recent (Summer 2001) major security flaws in Microsoft's own code, Microsoft may find it difficult to convince the market that its systems are secure. Keeping Windows NT and 2000 servers secure is a job for full time security professionals. The small shops that have bought Windows servers for their ease of use simply will not have the resources to keep up with patches and keep their servers running, given Microsoft's track record with buggy patches.

Given the complexity of Windows, the complexity of the tools and middleware used to build the applications, and the complexity of the applications themselves, it's absurd to think that all the security related bugs can be found and fixed. The owners of such systems can only hope that the holes that exist are sufficiently difficult and obscure, that no skilled malicious intruder actually finds them. To be fair, comparable application level vulnerabilities are likely to exist on any platform capable of deploying comparably sophisticated applications.

Despite all its drawbacks as a server system, Windows 2000 Server may at times be a necessary choice, because applications that perform necessary tasks are, in some situations, available only on Windows platforms, and the costs of custom development on better server platforms are prohibitive.

OpenBSD contrasts with Windows servers in almost every possible regard. It's elegant in its simplicity and unrivaled in its basic security features. It probably has the cleanest code base of any server operating system. For a small organization that needs to run standard Internet servers, it's hard to beat the cost and reliability provided by OpenBSD. Further, for organizations committed to developing modest scale custom systems, including basic e-commerce, OpenBSD may be a good choice. There are enough open source languages and development tools that anything that can be entirely custom built can be built on OpenBSD.

OpenBSD does, however, have some serious limitations. One is scalability. OpenBSD does not currently run on multiprocessor machines and does not cluster. This makes it unsuitable for any environment where either a very large amount of CPU power must be applied to a single problem or a significant number of independent machines must share the workload of a large number of similar problems. OpenBSD is not likely to be a good choice for any organization that will ever have a total of two dozen or more servers. This number is purely arbitrary, but it gives my sense of the upper size limits where OpenBSD might be appropriate as the standard server operating system.

Because of its tremendous security strengths and encryption tools that are not subject to U.S. export restrictions, OpenBSD may be the ideal choice for much larger organizations that have a number of locations connected to the Internet and need a border technology over which they can have complete control. OpenBSD can provide strong firewall and VPN technology with unlimited customization capabilities for little more than the cost of the hardware, though obviously extensive customization will have the same kinds of costs associated with any software development.

OpenBSD's other serious limitation is the lack of applications, including server applications and development tools. Beyond the standard TCP/IP servers, a modest variety of open source languages and development tools, and some packages serving relatively general needs associated with creating business applications, the choice of server applications and middleware for creating customized applications without large amounts of custom coding is very limited compared to both Windows and Linux. It's unlikely that custom applications assembled with major portions consisting of pre-packaged components will be a viable option on OpenBSD servers.

There are few commercial support options available for OpenBSD and few developers and consultants with OpenBSD specific skills. On the other hand, OpenBSD is part of the BSD family, whose members are all very similar in terms of system layout, startup scripts, and available commands, so that most user and administrator skills are transferable across the BSD family of products. Most books that include UNIX in the title and do not qualify the version will be applicable to BSD systems. OpenBSD generally keeps its man and info pages up to date with the product as it's developed.

Linux lacks the technical purity of OpenBSD and is based on a development model that borders on the anarchistic. As a practical matter however, Linux provides a server operating system with reliability and stability characteristics roughly comparable to OpenBSD and far superior to any version of Windows.

The default security characteristics of Linux depend on the distribution and install options chosen. They may vary from as bad as Windows to apparently stronger than OpenBSD. Though Linux has had, and is likely to continue to have, a number of security issues, they are fixed very quickly. Unlike Windows, Linux is a system that lends itself well to standard hardening practices. With not much more work than OpenBSD requires, and much less than Windows, Linux administrators can create systems with almost any mixture of functionality and security tradeoffs appropriate to the situation in which the server will be used.

Severely stripped Linux systems that cannot be changed, because they run from CD-ROMs or even write protected floppy disks, can be and have been made. There are pre-hardened distributions, EnGarde and Trustix, and automated hardening tools, such as Bastille, available. There is even a Security-Enhanced Linux developed by the National Security Agency. This enhances the Linux kernel to allow mandatory access controls to be placed on any and all system resources, a capability not present in any other general purpose operating system. If the preceding techniques aren't sufficient, adding the Security-Enhanced kernel should allow meeting virtually any security objective.

Linux has been used to build powerful parallel supercomputers, so it unquestionably clusters well, and the newest kernels should be comparable to Windows on multiprocessor systems. Even if one could provide convincing evidence that Linux was generally slower than Windows (and I know of no such evidence), the best way to compare Linux and Windows performance would be on a cost per unit of work basis. Since licensing costs have typically been one time costs incurred when the hardware is purchased, it makes sense to combine hardware and software costs for performance purposes and to deal separately with staff related costs, which are ongoing. When you compare equally priced Linux and Windows servers, including the software that is absolutely essential for the hardware to have any value whatsoever, there is little doubt in my mind that Linux would win a variety of performance tests.

Linux has a very large range of applications, both commercial (proprietary) and open source. As the platform that most open source applications are built on, Linux has by far the largest number of open source applications of any operating system. There are almost no business needs that cannot be met by Linux applications. As by far the most widely used open source system, Linux has the best and most diverse free support available, which is often better than traditional commercial support. If you want to pay for support, there is everything from the limited support that is automatically included with the purchase of any Red Hat boxed set to full support options from major vendors including IBM and a variety of independent vendors that exist solely to support Linux.

Moving Away From Windows

Today Windows, thinking of it the Microsoft way as a single "family" of products, is the dominant operating system. When desktop systems are included as well as servers, nothing else is even a close second.

When it comes to servers only, I don't have a clue which OS accounts for what percentage of the market because there are so many potential ways to count. From time to time I have seen numbers that attempt to describe the percent of the market the leading server operating systems account for; I don't recall seeing the measure used, let alone the methodology behind the numbers. Do you attempt to count total installed base or just the past year's sales? When you get to the open source systems, how do you count "sales"? Do these include downloads, where a single download may account for no systems installed, a single temporary test system, or dozens or even thousands of production installs, which could be upgrades or new machines? In fact, when you get to open source systems, I don't think there is any methodology that can be more than an educated guess as to how many systems are in use or have been put into use in any specific time frame.

Perhaps the simplest way to measure server operating system market share is to count the number of machines on which a server OS is installed, regardless of the power or value of the systems in question. By this measure I would expect Windows to be the leading server OS today. Enough operating systems, including most UNIX variants, can be used as either a server or a desktop that you need to know how a machine is actually being used to know how to count it, whereas with Windows, presumably any product labeled a server is being used as a server, though I know of at least one developer who uses Windows 2000 Server as a desktop system. Another possible way of counting is the total installed value of the systems, which raises the question of whether you count purchase price, depreciated value, or estimated current market value. By this kind of measure I wouldn't even guess which system is on top, and one also has to consider whether or not to count mainframes and supercomputers. One might also count total CPUs, or processing power as measured by MIPS or some other scale.

Whatever estimates of server operating system deployments might be made, several generalizations seem likely to be true. Larger companies normally have a greater diversity of computing platforms, especially those formed at least partially by mergers and acquisitions. As companies get smaller, platform diversity decreases and is increasingly likely to be dominated by Windows products. In the server area, though smaller companies still have a mix of UNIX, Novell, and in some cases older proprietary midrange hardware, there is a clearer shift towards Windows server dominance in small companies as opposed to large ones. Except for a few technology companies, generally the smaller a company is, the less likely it is to have Linux, and certainly OpenBSD, experience or installed servers.

The availability of critical applications on Windows NT and 2000 servers, and not on competing platforms, may result in the rational selection of Windows servers in any environment. In all areas important to server selection except application availability, Linux is a clearly superior server platform compared to Windows NT and 2000. OpenBSD is weak on application support and scalability, but in other areas is a significantly superior server platform to Windows NT and 2000, and excels in security related areas. OpenBSD has adequate application support in some key areas, and Linux has very good application support in most areas. If my relative assessments of the different systems are correct, why aren't more of the open source systems being selected? A rational comparison of strengths and weaknesses should result in a higher selection rate in those areas where Linux and OpenBSD have appropriate application support.

To the extent that what others are doing is relevant to your selection, the indications are that Linux is the fastest growing server platform. I could quote a "study" that shows Linux growing at twice Windows' rate, but I don't really believe those numbers, not because they seem unreasonable but because there is no reliable methodology to establish such numbers, and there was no hint of the measures used, let alone the methodology. I tried to find a way to count how many web pages discuss Windows versus Linux as server operating systems. I couldn't find one, but interestingly, single word Google searches (August 2001) for Linux show 34.2 million pages, Windows 32.9 million, and Microsoft 20.9 million. Does this have any significance? I suspect that it does, but knowing what might require the ability to know the future.

One factor hindering Linux adoption is a somewhat rational fear of the unknown. I believe it's fair to say most IT professionals still do not have hands on Linux experience, and certainly most do not have OpenBSD experience. Many have never heard of OpenBSD. The situation with Linux is changing, though much of the experience is not professional. Most college graduates in computer fields have at least some Linux experience from school computer labs, and a growing number of professionals have home systems. Generally, the higher one is in terms of decision making authority, the less likely it is that one will have any practical Linux experience. The firms that have the most to gain from Linux, small (but not tiny) to mid size firms, are less likely to have staff with professional Linux experience. Here I'm thinking of firms with IT staff sizes ranging from perhaps four to a hundred. The actual numbers are arbitrary, but below some minimum staff size, given staff with typical talents, supporting a diversity of operating systems clearly takes a toll. In very large organizations, it's very likely that some staff will possess appropriate Linux experience, even if it has been gained at home, and will try to find a justification to install one or more Linux servers in a production capacity.

Few IT professionals are willing to make major decisions based entirely on the opinions of others without supporting personal experience. We've all read outstanding product reviews and sometime later found our opinion to be very different from the reviewer's. Anyone who has no open source experience and who has never experienced serious Windows problems comparable to those I describe is not likely to be convinced by my arguments.

Application Choice Overvalued

Then there is the matter of what factors are most highly valued in the computer industry. I have discussed a number of factors, all of which get varying degrees of coverage in the press. There is one factor that is valued so much more than any other, actually more than all the other factors combined and by a wide margin, but is rarely explicitly acknowledged as an overriding selection criterion. This is functionality, or in the case of operating systems, application support and availability. Purchasers will discard security (if they even considered it in the first place), accept significant reliability penalties, and accept performance penalties in the form of paying for faster CPUs, accelerator cards, and lots of RAM, so that their computers will run any application they want, including applications not yet created. When the trade press reviews operating systems, they may not even mention the issue of application availability, but it's generally known that Windows enjoys a large lead in this regard, and this is true for both server and desktop applications.

For home computers, it's hard to argue with this. A single computer is likely to be the only, or at least the primary, computer for two to six years, and there is probably no way to predict how interests may develop over such a period. Until very recently, with the advent of full time, high speed Internet connections, security simply was not a practical issue on home computers. Even today only a small percentage of home computers (those with high speed connections) are seriously impacted by significant security concerns. Even the significant ease of use and setup advantages that Macintoshes have had for much of their history could not compete historically with the broader choice of DOS and then Windows applications.

Business desktops occupy a middle ground between home systems and servers. They will be discussed elsewhere.

Though the server operating systems discussed here are general purpose operating systems, the actual server systems that run on them rarely are. It's commonplace for a server to be purchased for the purpose of running a single application or application suite. The vertical market applications that provide core business functions often run on dedicated systems. Databases, e-mail servers and/or gateways, firewalls and other Internet border technology, file and print servers, web servers, and a wide variety of other server applications may be purchased for single purpose functions. Often supplementary management or utility functions will be added to such servers. Additional unrelated applications may be added where a server appears to have excess capacity. Still, the large majority of servers are purchased to perform a small number of functions.

Most computer and communications devices called appliances are really servers: stripped of unneeded functions, often hardened to some degree, limited to a single purpose or closely related group of purposes, and given an easy to use management interface. There is a natural tension between the keep-it-simple philosophy of an appliance and the broader goals of a general purpose computer used as a specialized server but retaining the option to perform additional functions.

There is no right or wrong answer regarding whether it's best to serve a variety of functions from a single powerful machine, or use multiple single function machines matched to their tasks. The more focused each machine is, the simpler it will be to configure, secure, and evaluate performance issues, including upgrading if necessary. Simpler machines are more modular, and it may be easier to introduce a new service, without disrupting existing functions. On the other hand, separate machines will typically require a greater effort to integrate, and generally increase management burdens, especially with routine issues like backups and log management. Separate machines make less efficient use of resources, as each needs enough capacity to handle its maximum loads, but this excess capacity is wasted at other times.

Regardless of the balance struck between simple limited function servers and powerful multifunction servers, all unneeded and unused services present on any server represent a liability and a security threat to that server. Thus, even if a powerful multiprocessor server is specifically set up to run a dozen distinct server applications, every service and application present on that machine that is not part of, or necessary to, the functioning of those dozen applications should be disabled and removed. It is not an advantage to have an operating system that offers more services needing to be removed or disabled, especially when they are poorly documented and difficult to locate, as they are on Windows systems.
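On a Linux server this audit-and-disable process is straightforward. The sketch below assumes a Red Hat style system with chkconfig and System V init scripts; the service names shown (lpd, portmap) are only examples of common candidates, and what you actually disable depends on what your own audit finds.

```shell
# List every port something is listening on, with the owning program.
# (-p requires root; some older netstat versions lack it.)
netstat -atup

# Show which services each runlevel starts.
chkconfig --list

# Disable and stop services this machine does not need.
# lpd and portmap are examples only; audit before disabling.
chkconfig lpd off
chkconfig portmap off
/etc/rc.d/init.d/lpd stop
/etc/rc.d/init.d/portmap stop

# Services run from inetd are disabled by commenting out their lines
# in /etc/inetd.conf, then telling inetd to reread its configuration.
killall -HUP inetd
```

The point is that every listening service is visible in a single netstat listing and can be turned off with a documented command, which is exactly the transparency Windows lacks.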

When purchasing a server, all known needs should be considered, as well as any capabilities that are currently under serious consideration. Placing a high value on providing unneeded functions, i.e., the ability to run as yet unidentified applications, is a fundamental mistake in selecting servers. Except in the very smallest organizations that will never have more than two or three servers, it makes sense to buy a new dedicated server on those rare occasions when some new product, not available on the company's primary server platform, quickly and unexpectedly becomes important to the company. Any product that can't justify the acquisition and support costs of an additional server is not likely to be of strategic importance to the company. Routinely spending unnecessarily large sums on current servers because they might be able to support some as yet unidentified need, when less expensive servers can meet all anticipated needs, including growth, is high-stakes, low-odds gambling.

Introducing Linux to a Windows Environment

If, as I believe, Linux generally, and OpenBSD in some specific situations, represents a significantly better value proposition as a server operating system than Windows NT and 2000 Servers, how can Windows dominated organizations begin to take advantage of Linux's benefits? I don't ask how to switch from Windows to Linux, because very few organizations that are currently dominated by Windows will be able, or even wish, to consider switching in the near future.

An organization that uses mostly Windows operating systems with a variety of loosely integrated third party products will have a relatively easy time moving away from Windows. I say "relatively easy" because substantially reducing a Windows dependency is a major task. The more Microsoft products in active use, the harder any significant move away from Windows products becomes. This is true to some extent for any platform, but for none so much as Windows. Any company that already has a significant investment in custom applications, built on Microsoft products with Microsoft development tools, is not likely to be able, or to desire, to significantly reverse this direction in the near future. For such companies, I'd advise watching for any early but clear indications of a significant downturn in Microsoft's fortunes and applying the advice that follows to the extent it may be practical.

Practical Linux or OpenBSD experience is a prerequisite to introducing either into a business environment. Without at least one IT staff member or manager with hands on Linux experience actively championing it as a good, possibly the best, or even the only solution for a project, it's hard to see an all Windows shop embarking on a Linux project. If there are already one or more UNIX machines in the environment, the hurdle is much smaller, as Linux may then be seen as a low cost supplement to those other UNIX machines.

Gaining Linux Experience

Today, any forward looking computer professional can see that open source and Linux are playing and will continue to play a growing role in the computer industry. There is lots of room for disagreement over how much and how soon, but it's certain the installed base will continue to grow for the foreseeable future, and the indications are that its market share is increasing and will continue to increase in both server and desktop markets. Until your company starts to use Linux, the best way to increase your Linux (or OpenBSD) knowledge and skills is to build and use a Linux machine at home. Perhaps the most common approach is to build a dual boot machine, but this is a poor solution at a time when once powerful machines are being discarded as inadequate to run Windows.

Generally people want to avoid the cost and space of a second PC. The problem is that after you prove you can install a second or third OS and fiddle with it a little, it will be put aside and forgotten. That assumes you didn't wipe out your primary machine in the process. Home machines and even office desktops are rarely properly backed up. With Windows, unless you have a tape drive, or a CD-R and software that lets you make bootable recovery CDs, there is no reliable way to get a backup that ensures you can restore a system to its previous state. I cannot imagine dealing with the grief of losing days or even months of work, and even if all user data is saved, rebuilding a Windows machine that's evolved for a year or more is almost unthinkable. It's been several years since I last had a hardware disk failure, but creating dual boot systems is still one way to put all the data on a disk at risk.

On a dual boot system, assuming there are no serious problems and both systems work, the two systems cannot communicate directly. Data exchange will depend on removable media, or be limited to Linux reading and writing the FAT partitions, unless you install Linux into FAT partitions (no thank you). Rebooting is a pain; regularly rebooting to switch back and forth between systems is even more so.
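For what it's worth, the Linux side of a dual boot machine can usually get at the Windows data with a mount command like the sketch below. The device name, mount point, and file name are all assumptions that depend on how your particular disk is partitioned.

```shell
# Create a mount point and mount the Windows FAT partition read/write.
# /dev/hda1 is an assumption: the first partition on the first IDE disk.
mkdir -p /mnt/windows
mount -t vfat /dev/hda1 /mnt/windows

# Files under /mnt/windows are now ordinary files to Linux.
# somefile.txt is a hypothetical example.
cp /mnt/windows/somefile.txt /tmp/

# Unmount cleanly before rebooting into Windows.
umount /mnt/windows
```

Even so, this only works while Linux is booted; nothing makes the Linux filesystems visible from the Windows side.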

A much better solution is to use an older PC that might otherwise be discarded. If you don't have an older machine at home, any machine that is being discarded at work is probably adequate for Linux. Ask for one; any licensing or data confidentiality issues can be dealt with by wiping the hard disk before the machine is given to you. Linux just doesn't require the resources that Windows machines do, especially if you don't plan to use the X Window System, or will use it only minimally. Two monitor and keyboard setups are a real nuisance, but good four port keyboard, video, and mouse (KVM) sharing switches can be bought for under $100, and two sets of cables for less than $20 more. You sit at one comfortable location and, with a few keystrokes, switch between two or more machines.

You don't want to worry about getting your first Linux install right. It's great if you do, but it's best not to get attached to it. It really helps, when learning a new system, to install it multiple times, trying different options each time. With an old system, you can reinstall and repartition as much as you want without the slightest worry about your primary system. You can start to get a real understanding of partitioning options and UNIX filesystems, which you are unlikely to have gained in college. Repeatedly installing into a dual boot configuration is not a process I'd recommend.

Perhaps the best thing about two systems, even though working with two machines through one keyboard and monitor may seem a little odd at first, is that you can network them. In most work environments, Linux systems will co-exist with Windows systems, and learning to network them will be valuable experience whether you're a software developer or a system administrator. With two live systems that communicate, you can begin to simulate and experiment with all kinds of things that are totally impossible on a dual boot system. With only two systems, assuming both have Ethernet cards, only a $5 crossover cable is required for networking.
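A minimal two machine network needs nothing more than a static private address on each end. The commands below are a sketch for the Linux side, assuming the Ethernet card shows up as eth0 and picking addresses from the 192.168.1.0 private range; the hostnames are invented, and the Windows machine gets the matching settings through its network control panel.

```shell
# Give the Linux machine a static private address on eth0.
ifconfig eth0 192.168.1.1 netmask 255.255.255.0 up

# Give both machines names so you don't have to remember addresses.
# linuxbox and winbox are hypothetical names.
cat >> /etc/hosts <<'EOF'
192.168.1.1   linuxbox
192.168.1.2   winbox
EOF

# With the Windows machine configured as 192.168.1.2, verify the link.
ping -c 3 winbox
```

Once ping works, everything else, telnet, FTP, Samba file sharing, a web server, is just a matter of which services you choose to run and explore.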

You'll spend much more time on the Linux system if checking your e-mail, and doing the other routine things you're used to doing on the Windows machine, is only a few keystrokes away. You could check e-mail on Linux, but splitting e-mail from a single POP3 account across multiple machines creates a number of problems. Your primary desktop system will have your most frequently used applications and their data. If using another operating system makes these inaccessible, there will be strong pressure not to use that other operating system. By its very nature, a dual boot system makes whichever system is not currently booted inaccessible.

Another possibility, if you can't or won't go the two machine route, is to get a disk tray for your PC that turns your hard drive into a removable drive. Then get a second hard disk (they are so cheap now) physically similar to your original and install Linux on it. This eliminates the worry of damaging your primary system. It is a good solution for three or more systems, as the more systems you try to multi boot, the more problematic the results and the greater the likelihood of losing data. With two removable drive systems, you could network any combination of systems you wished.

If you already have two or more PCs, and have or are considering a cable modem or DSL connection, a good project is to build a custom firewall on either Linux or OpenBSD. Linux is obviously the more marketable skill, but if your interests lie mainly in security, this is OpenBSD's strength; OpenBSD firewalls in environments that otherwise use different systems are a real possibility. Almost any 486 or better that you can get network cards for will do. Your Internet provider may already provide a firewall, and a growing number of good firewall appliances aimed at homes and very small businesses are becoming available, but you won't learn anything using these devices, and you may have to trust a configuration set up by someone else. I learned considerably more about networking with TCP/IP in the two months I focused on firewalls than in the preceding eight years of various Windows and UNIX networking and web work. I went well beyond the basics of getting a functional firewall, but that was the point of the project.
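To give a hedged sense of what such a project involves on the Linux side, a minimal firewall script built with the iptables tool from the 2.4 series kernels might look roughly like the sketch below (OpenBSD uses its own, different packet filtering tools). The interface assignments are assumptions, and a real firewall deserves considerably more rules and more thought than this.

```shell
#!/bin/sh
# Minimal stateful firewall sketch.
# Assumptions: eth0 faces the Internet, eth1 faces the internal LAN.

# Start from a clean slate and drop everything by default.
iptables -F
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT ACCEPT

# Allow loopback traffic and replies to connections we initiated.
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Let internal machines out, and let their replies back in.
iptables -A FORWARD -i eth1 -j ACCEPT
iptables -A FORWARD -m state --state ESTABLISHED,RELATED -j ACCEPT

# Hide the internal network behind the firewall's external address.
iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
echo 1 > /proc/sys/net/ipv4/ip_forward
```

Working out why each of these lines is there, and what breaks when one is removed, is precisely the kind of learning the project provides.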

Yet another way to really learn Linux is to start with a new high end PC and install Linux as the primary system. Then get VMware and install Windows 2000 Professional, NT, ME, or whatever would normally be your primary desktop inside a VMware virtual machine. Then migrate your desktop applications to the new PC. This gives you Linux all the time, with your Windows system available as much as you need it. Over time you could phase out any Windows products for which better ones became available on Linux. If you planned to stay in the Windows world, you could keep your Windows applications up to date and get the best of both worlds.

The drawback of this approach is the cost it could incur due to licensing issues. If the OS and most of the applications on your current Windows PC came with the PC, or are upgrades from products that came with the PC, the licenses typically won't allow what was just described. To be legal, you'd need to buy full retail, not upgrade, versions of the OS and applications that came with the older PC.

A KVM sharing box is a much cheaper alternative, as the VMware and Windows software costs could easily exceed the cost of a new PC. A KVM switch allows the existing Windows machine to be used as long as it's useful, and eventually to fade into complete disuse as it ages, while both machines remain fully networked to each other. If Windows is and will remain your primary environment, and Linux is just a learning experiment, then clearly you'll want to keep the Windows machine the faster of the two. If you're not close to a Windows machine upgrade, and have no spare old machine to experiment with, a new low end to mid range desktop along with a KVM switch makes an ideal solution, if you can afford it.

First Linux Project

I see a few good scenarios for introducing a first Linux machine. For any Internet connected company that still lacks a firewall, setting up an OpenBSD or Linux firewall could be an excellent first project. If, however, this were done poorly and blocked services users expect, without planning, explicit management approval, and forewarning to users, it could do more harm than good.

Another potential Linux project is any situation where a user department wants some new capability that would require a server and application software for which there is no budget. If there is an open source product that is a good fit for the department's needs, and IT sees this as a valuable project likely to have visible benefits to the company, it could make an ideal demonstration project.

I can't provide examples that fit this description because they depend on detailed knowledge of a company and project. IT staff or managers with Linux familiarity who hear user requests might do informal Internet research to see if there are suitable open source products. In preliminary discussions with users, be very clear that no commitments are being made. Be sure that you have a good understanding of both user needs and product capabilities, and keep in mind that users' "needs" have a strong tendency to grow once a project is started. Never tell a user you have an easy, quick, or cheap solution to their problem unless you are really sure you do.

The kinds of products I'm most familiar with tend to be infrastructure, such as web and e-mail servers, and older established technologies. As most companies now have web sites, a web server would not be a new project; there would need to be real technical problems with the existing site. If there are, there are few better ways to use Linux than to replace IIS with Apache. If ASP is widely used, porting becomes a major issue. If Perl is widely used but makes extensive use of Windows APIs or other Windows specific features, porting could be a significant issue. Use of FrontPage introduces significant issues of its own. If Index Server is set up well and works, its functions are hard to match with available free products. If staff are used to updating the site via drive sharing, this can be done on Linux via Samba, but it is not an optimal solution for security reasons. Even if all the technical issues favored a move, web sites are now typically highly visible and any change is likely to have significant political visibility. If an existing web site's problems were content related and not platform related, then changing platforms would not help, and could exacerbate the existing problems.
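For a static or Perl CGI based site, the Apache side of such a migration is genuinely small. The following is a minimal, hypothetical httpd.conf fragment in the Apache 1.3 style of the era; the host name and paths are assumptions, and a real migration's effort lies in porting the content and scripts, not in these few lines:

```
# httpd.conf fragment -- minimal single-site setup (hypothetical names and paths)
ServerName   www.example.com
DocumentRoot "/var/www/htdocs"

# Map /cgi-bin/ URLs to a script directory; ScriptAlias both aliases
# the path and marks the target directory as containing CGI programs.
ScriptAlias  /cgi-bin/ "/var/www/cgi-bin/"
```

Perl CGI scripts that avoid Windows specific modules will often run under this setup with little more than a corrected interpreter path.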

At first glance a list server or other mass e-mail tool for a marketing department might look like a good open source project: good list server software runs several thousand dollars in addition to OS costs, there are multiple open source list servers, and the top commercial list servers run on Linux as well. In their traditional form, however, list servers are purely standalone applications. Bulk e-mail loses much of its value unless results are kept in the same customer database as purchase history and other pertinent data. Depending on the software used, bidirectional data updates or routine large scale exports and imports are likely to be needed. The customer database may need new fields and maintenance capabilities to track customer e-mail preferences. Correcting invalid and changed e-mail addresses can quickly become a major task.

Any project that requires regular access to, and especially update of, existing data should be treated as an extension of the systems that use the data. There may still be a role for a Linux server providing supporting services to some existing applications, but the main point of these suggested demonstration projects is to show that real maintenance costs for the Linux OS are considerably less than for Windows server OSs.

If a project results in IT staff performing frequent unexpected application related functions, it will say nothing about either OS, and will obscure the results. If Linux is going to support existing applications, be sure that all tie-ins are fully automated or performed by system users rather than IT staff, and that the programming and configuration necessary to accomplish this is included in project estimates. On any non-trivial project, the value of staff time will likely exceed initial hardware and software costs (and must exceed them if an open source OS and applications are used on older and otherwise unused hardware).

Failure to consider all the ramifications of a project may easily lead to a sense that it can be done very cheaply using open source, since no additional hardware or software costs may be involved, but produce significant surprises in development or ongoing staff costs. This is in no way specific to Linux; it's just a general acknowledgment that some apparently simple projects can have large unforeseen costs. This is a very real possibility any time a new application uses existing data, especially if the results are in any way visible to your customers.

Some other first projects for Linux are less visible. If staff need remote e-mail access, Linux is an ideal platform for POP3 or IMAP servers. Even if Exchange is used as the e-mail system, for security reasons it should not be exposed to direct Internet access; either Sendmail or, preferably, Postfix can be set up as an intermediate mail transfer agent between Exchange Server and the Internet, and one such server might front for multiple Exchange servers. Squid could be used as a web proxy server, improving internal web access and reducing the load on the Internet connection. Other good first uses for Linux are DNS, DHCP, and public FTP servers.
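As an illustration of how little configuration the Postfix gateway role requires, here is a hedged main.cf sketch. The domain name and the internal Exchange server address are hypothetical placeholders, and a production relay would add anti-relay and anti-spam restrictions beyond this minimum:

```
# /etc/postfix/main.cf fragment -- Internet-facing relay in front of Exchange
# example.com and 192.168.1.10 are hypothetical; substitute your own.
# Note: Postfix does not allow comments on the same line as a setting.
myhostname = mail.example.com
# Accept inbound mail only for our own domain
relay_domains = example.com
# Route accepted mail onward per the transport table
transport_maps = hash:/etc/postfix/transport
```

The transport table itself would then contain a single line such as `example.com smtp:[192.168.1.10]`, where the square brackets tell Postfix to deliver directly to that host, skipping MX lookups, so mail flows to the internal Exchange server without exposing it to the Internet.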

In some ways, the low profile nature of these just mentioned projects is an advantage for a demonstration project. Much of the point is to show that for many functions, once a Linux (or OpenBSD) machine is set up, it may largely be forgotten until a change or enhancement needs to be made. Other than the routine change of backup media, several of these suggested Linux uses really should be maintenance free. Besides backup, all computers have some user and disk maintenance issues, but these should be negligible in these behind the scenes uses. After such a Linux machine has been up for weeks or months without a problem or reboot, Windows administrators should begin to appreciate the differences. Compare this experience to your periodic and/or maintenance-forced Windows reboots, server crashes, and unexplained problems. After the hurdle of the first Linux machine is passed, the next time someone says "We need another server for . . ." consider both the Windows and Linux choices.


I've always found a job to be more satisfying and less pressured when doing something new and moving forward, whether a significant new software development project or simply automating a routine process through cron, than when fixing systems that had been working and now are not. My Windows NT Workstation experiences since early 1996 have for the most part been positive, because it's been a far more stable desktop system than any I'd worked with before.

My Windows server experiences were never better than mixed. By 1997, when I seriously started working with NT Server, I already had a few years of AIX and Sun experience, so from the beginning Windows servers were somewhat of a step backwards. Over the next four years, as I worked with a growing variety of NT servers performing different functions, and spent growing amounts of time troubleshooting a variety of unexplained and sometimes inexcusable problems, my opinion of NT as a server system became more negative.

Once I had an opportunity to extensively compare NT to Linux and OpenBSD servers doing, or attempting to do, essentially similar tasks, the issue was settled in my mind: Windows NT Server was clearly an inferior server platform compared to either, despite its application choice advantages. As I wrote this comparison, I expected to keep my public NT web site mirror for some time. When it self-destructed on August 19, 2001, due to a bad memory chip, I decided there was no point in investing more time restoring a system that was part of my professional history but will not be part of my professional future. When I have the time, my former Windows NT web server will be replaced by FreeBSD or a Linux variant.

Regarding Windows 2000 Server, it's obviously an improvement over NT Server in most regards except certain price and licensing issues, but it is simply too little, too late, for far too much. Nothing can convince me the modest improvements (significant to Windows-only users) bring it anywhere near Linux or OpenBSD as a server platform. Why pay serious money for a clearly inferior product when much better is available for free or at negligible cost? Except for those situations where constraints force the selection of Windows 2000 Server, I can see no reason why any IT purchaser, knowledgeable of the alternatives available, including high end commercial UNIX systems, would ever willingly choose it.



Copyright © 2000 - 2014 by George Shaffer. This material may be distributed only subject to the terms and conditions set forth in the site's terms of use. These terms are subject to change. Distribution is subject to the current terms, or at the choice of the distributor, those in an earlier, digitally signed electronic copy from the time of the distribution. Distribution of substantively modified versions of GeodSoft content is prohibited without the explicit written permission of George Shaffer. Distribution of the work or derivatives of the work, in whole or in part, for commercial purposes is prohibited unless prior written permission is obtained from George Shaffer. Distribution in accordance with these terms, for unrestricted and uncompensated public access, non profit, or internal company use is allowed.

