The Unity of Unix (2005)

Although many people claim that Linux is well on its way to replacing Unix, the reality is that Linux is Unix: a particular stream within a much wider community whose traditions and ideas both surround and extend those found in the Linux group.

Look carefully at the history of Unix and what you see is a core set of ideas worked out in many different ways by many different people. Unix detractors invariably talk about the fracturing of Unix and refer disparagingly to what they call the "Unix wars" of the nineteen-eighties, but in reality even the most widely divergent Unix products generally differed in implementation detail and hardware support, not in concept.

In this context it's important to remember where Unix and open source came from: they're implementations of core academic traditions in the development of community and the publication of results.

The computer science side of the turf wars that took place at MIT during the early-sixties run-up to the Multics development decision wanted the system to act as a focal point for the development of a community of users with openly published source code. Here, for example, is part of what key Multics designers Corbato and Vyssotsky wrote in 1965:

 

It is expected that the Multics system will be published when it is operating substantially. ... Such publication is desirable for two reasons: First, the system should withstand public scrutiny and criticism volunteered by interested readers; second, in an age of increasing complexity, it is an obligation to present and future system designers to make the inner operating system as lucid as possible so as to reveal the basic system issues. ... The present plans for the Multics system are not unattainable. However, it is presumptuous to think that the initial system can successfully meet all the requirements that have been set. The system will evolve under the influence of the users and their activities for a long time and in directions which are hard to predict at this time. Experience indicates that the availability of on line terminals drastically changes user habits and these changes in turn suggest changes and additions to the system itself. It is expected that most of the system additions will come from the users themselves and the system will eventually become the repository of the procedure and data knowledge of the community.

From the academic perspective a time-sharing computer system looked like a communications nexus tying together a community of users - but that wasn't how the other side, comprised mainly of IBM's data processing professionals, saw it. To them secrecy was essential and a computer was an electronic clerk used to replace people, not to extend their abilities.

In the end the science side won the funding and design battles, but development was handed over to the data processing professionals. What they produced, of course, was a system (and a development project) cast more in their own image than that of the designers. Thus it took Thompson's single-minded rebellion against the excesses and mindset of the Multics project to produce Unix - really the product Multics had been intended to be.

Today the best known Unix variants are Linux, the BSDs, and Solaris. Of these, Linux is fundamentally still what it set out to be: Freax (pronounced freeux), a free Unix for the 386; the BSDs continue the research heritage while powering four million new Macs a year; and Solaris is leading the migration to next generation Unix by implementing true network computing on the Plan 9 model.

It's because of this commonality of origin and purpose that the overwhelming majority of what people need to know to make effective use of Unix is independent of the product label. Core processes, from access to development libraries to the fundamentals of day-to-day operation, are functionally the same across all major variants, and so are hundreds of GNU utilities and thousands of open source applications. From both user and sysadmin perspectives Perl is Perl, Postgres is Postgres, and Samba doesn't change in any significant way whether the host runs Linux, NetBSD, or Solaris.
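To make that concrete, here's a trivial sketch of my own - not from any vendor manual - that leans only on the POSIX uname(2) interface. The same source compiles with the stock C compiler and runs unchanged on Linux, NetBSD, or Solaris; only the strings it prints differ from box to box.

    /* Portable across Unix variants: everything here is plain POSIX. */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void)
    {
        struct utsname u;

        if (uname(&u) == -1) {   /* fills in kernel name, release, hardware */
            perror("uname");
            return 1;
        }

        /* Prints, e.g., "Linux 2.6.x on i686" or "SunOS 5.10 on sun4u". */
        printf("%s %s on %s\n", u.sysname, u.release, u.machine);
        return 0;
    }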

One of the areas in which this has consequences is systems hiring. It's quite true that someone's hands-on experience with one of the dead or dying Unix variants won't apply directly to Linux, BSD, or Solaris. On the other hand someone who knows how to use HP's ISL configs or how to make raw devices under AIX usually also knows when and why to do these things - and that's what's important; any specifics needed are available in the on-line manuals.

That doesn't mean that Red Hat certification qualifies an applicant to debug Oracle on a 72-processor Solaris machine; there are differences both in the details and in the tools available. What it does mean is that the Red Hat guy's ramp-up to Solaris competence is very small compared to the hurdles faced by a competitor whose experience is encapsulated by an MCSE designation.

People who categorize the Unix market as splintered or fractured are generally trying to compare it unfavorably to Microsoft's Windows. That's simply wrong: Windows is a brand, Unix a set of ideas. The Windows brand has been consistently handled, but there's essentially no continuity of ideas between the 3.0, 95, NT, and Longhorn Windows generations. The Unix hardware makers, in contrast, have tried hard to differentiate their products through branding when, in reality, all of their products have been part of the same family.

Oddly enough, therefore, both beliefs - that Microsoft has been consistent and that Unix hasn't - are consequences of marketing fictions.

In Microsoft's case that marketing fiction has required some pretence to backward compatibility - with the odd result that today's sixty-million-line desktops will run some ten-year-old binaries, but won't allow code written for previous Windows generations to compile.

Unix doesn't have Microsoft's surface consistency, but theory drives change to build a record of continuity as ideas are tested, accepted, and implemented. As a result the examples in Kernighan and Ritchie's 1978 The C Programming Language work today, Kernighan and Pike's 1984 The Unix Programming Environment applies about equally well to Linux, NetBSD, and Solaris, and binaries built for the first 64-bit UltraSPARCs ten years ago will run, unchanged, on Sun's next generation Niagara hardware.
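For the doubters, here's the opening example from that 1978 book, lightly dressed up: as I recall, the original omitted the #include and the explicit return type, which a present-day compiler will grumble about, but the program is otherwise the one Kernighan and Ritchie printed twenty-seven years ago, and it still builds and runs on every Unix variant mentioned here.

    /* K&R's first example, 1978 edition; the #include and the int
     * return type are modern additions, everything else is original. */
    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }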

I'm writing this on the fourth of July and the phrase "e pluribus unum" (out of many, one) comes to mind. Look at Unix as it is today, as it was ten years ago, or as it started in the seventies and that's what you see - out of many, one: many developers, many agendas, many skillsets, many variants, one continuously developing and expanding set of ideas.

That reality has implications for everyone involved. Competition has always been part of the game, but whether you prefer Linux, BSD, or Solaris is fundamentally immaterial when the choice is Unix or Windows - helping an employer choose to install and use any Unix variant grows everybody's market. Remember, there are bad guys - companies that value money over progress - in the race, but if you work with Red Hat the enemy isn't Sun, and if you work with OS X the enemy isn't Linux: for any Unix the enemy is Microsoft.

There's a simple, personal, bottom line to this: Unix is Unix, and the benefits come from openness, community, and forty years of consistent progress in the implementation of a handful of key ideas. In that context whether your skills come from working with brand A or brand B really doesn't matter: you use what works. If you're most comfortable with Red Hat but your employer's needs whisper "Solaris" or "Darwin", grab the opportunity to learn a bit more - it'll do you, and your employer, nothing but good. Remember, a rising tide lifts all boats: the more Macs and Sun machines get installed, the more value your Red Hat certification will really have.

 

 

