There's an old saying that those who refuse to study history are doomed to repeat it. The IT version says that people who base IT decisions on popularity are doomed to forever accept recycled ideas as new and advanced.
With that in mind, I want to dig out something from last January discussing a plan by one Mark Whitehorn to blog his company's transition to the wonders of the 1980s as coded into Microsoft's Server 2008 stack:
About Windows Server 2008
There's an old saying about the devil making work for idle hands - so there I was last week, feeling a little bored and idly clicking through a story both about and by some guy named Whitehorn who's going to blog an extensive development project based on the forthcoming Microsoft server stack. The opening paragraphs in the write-up make sense, and it seemed like an interesting project to follow - at least until I got to the self-congratulatory rationale for leading this organization down the garden path:
If an application runs fine on version eight of your preferred RDBMS, why would you even consider upgrading to version nine? The best you can possibly hope for is that it continues to run fine (no gain there then); the worst is crash, burn and time to find a new job elsewhere. This is why we have production systems still running in COBOL.

Our current database is running on SQL Server 2005, sitting on Windows Server 2003. We are also using Analysis Server 2005 to produce the OLAP cubes and ProClarity for the data visualization. It works. So why am I sitting here, two months shy of the launch of SQL Server 2008, Windows Server 2008 and Visual Studio 2008, desperately seeking servers that will run this entire 2008 stack and porting an application that will go live (for a subset of the users) long before the summer?
Masochism and/or bravado spring to mind, but it isn't those. Killer features continue to appear in software that confer such competitive advantage that procrastination is impossible. Of course, it's only a killer feature if you need it and our application really, really needs it.
The "it" in question here is the ability to handle spatial data. SQL Server 2008 has it. End of story. Yes, I know Oracle had spatial data types first, but are there other, associated, reasons such as the business intelligence capabilities that made SQL Server 2008 the obvious choice for us.
OK, but why Windows and Visual Studio 2008 as well? Don't think we didn't think long and hard about this. But, while the decision adds to the workload now, it should reduce it in the future. And if we don't [move to the full 2008 stack], we will spend the next two years sailing the sea of uncertain upgrades, dreading the support calls that start with "What OS are you running that on? Ah, well, if only you were running on..."
Having made the decision for all the right reasons, it would be disingenuous to pretend that we aren't looking forward to the challenge. I believe the conservatism discussed above is essentially forced upon DBAs by commercial considerations. In truth, if we aren't excited by challenges, if we don't like problem solving, what are we doing working in computing?
One gets the impression that Mr. Whitehorn might be a bit of a one-trick pony - someone whose Microsoft-related skills outweigh his responsibilities to his employer - if only because he doesn't seem to know that the facilities he needs were available on Ingres for BSD in 1981 and, in more advanced form, are available via Postgres on Linux or Solaris now.
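To make the Postgres point concrete, here's a minimal sketch - mine, not anything from Whitehorn's project - of the kind of proximity query Postgres handles via the PostGIS spatial extension. The database name, table, column, and coordinates are all illustrative, and it assumes PostGIS and the psycopg2 driver are installed:

    # A minimal sketch, not Whitehorn's code: find everything within 10km
    # of a point, using a hypothetical "stores" table with a PostGIS
    # geometry column named "location".
    import psycopg2

    conn = psycopg2.connect("dbname=demo")  # hypothetical connection string
    cur = conn.cursor()

    # SRID 27700 (British National Grid) is a projected coordinate system,
    # so ST_DWithin's third argument is a distance in metres.
    cur.execute("""
        SELECT name
        FROM stores
        WHERE ST_DWithin(
            location,
            ST_GeomFromText('POINT(530000 180000)', 27700),
            10000
        )
    """)
    for (name,) in cur.fetchall():
        print(name)

    cur.close()
    conn.close()

SQL Server 2008's new geometry and geography types support the same kind of predicate through methods like STDistance - which is rather the point: the capability long predates the packaging.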
SQL Server, of course, isn't the only component of this stack to rediscover things that have been more or less standard on Unix for decades. A couple of hours wasted reading excited reviews by Microsoft "enthusiasts" hyping this stuff suggests, first, that all the Longhorn ideas are completely gone - and, second, that there's nothing new in Microsoft's Server 2008 OS that doesn't come from Unix or IBM's even older VM.
I've been following his reports on the project - there are seven, numbered one through eight (sic) - including his rather pro forma June 2nd success announcement.
Aside from installation hassles ("In three days I went back to bare metal five times before everything was present and correct"), I saw nothing to suggest that anything he did couldn't have been done about equally well, albeit with much more difficulty and cost, with Unify 4.0 on NCR Unix in 1988, with Informix on SunOS in 1990, or with Illustra on Solaris in 1995.
There must have been something - certainly the hardware and networking change over the period has been dramatic. I have notes from a 1988 project review suggesting that a firm in Fort Collins had 12,000 production 160MB disks - call it 1.9TB in total - hooked up to BSD Vaxen using Unitree, and now you can pretty much achieve comparable storage and performance on a PC.
More to the point, Whitehorn doesn't detail his project much beyond bragging about the 1TB database size and his use of Microsoft's toolset - but as far as I know those tools don't incorporate anything that's really new, with most of the improvements since the 2003 server stack being unabashed imports from earlier Unix and open-source products.
So if you ask what twenty-five years of relentless technical advance at Microsoft has brought the customer, the answer is probably just packaging. Basically, if there's home-grown Microsoft technology in the 2008 server stack, I don't see it - but what I do see is powerful ideas made easily accessible to people with limited technical skills.
Look carefully at what Whitehorn writes and that's what you'll see: old ideas in new packaging, with the validation and success enjoyed by Microsoft's customer base coming far more from the packaging and marketing than from the implementation.
In many ways I think that's actually pretty cool: it defines the fundamental Microsoft value-add, the one thing justifying the economic cost of keeping the Microsoft "newseum" in business - but it is also, unfortunately, a probable root cause of the visceral rejection of all things Microsoft by those who do have the technical skills and historical perspective needed to evaluate it.