If you and I are running comparable organizations and we hire interchangeable IT people to do the same things with the same gear and the same software in the same ways, neither one of us is going to get any competitive advantage from our IT commitments. Notice that the issue here would seem to be parity, not technology: in other words, it seems self-evident that sameness does not differentiate, whether that sameness is expressed via Wintel, caged gerbils, an outsourced IT services supplier, or a Unix/smart display architecture.
Fortunately it only seems that way. In reality, there's a hidden variable: adoption of a specific technology implies co-adoption of the key management and applications ideas underlying that technology. As a result, industry-wide adoption of technologies that foster change drives organizational divergence in usage patterns and thereby creates opportunities for competitive advantage to emerge.
In environments where everyone follows the same leader, you get essentially homogeneous change across user organizations, meaning that parity is maintained over time and the axiom that sameness does not differentiate therefore holds. This is what happens, for example, in the Wintel world: there are early adopters and late adopters, but everyone ultimately follows the Microsoft baton - leaving user organizations only the choice of timing as they're forced to follow a single technical direction just to maintain operational parity with all the other followers.
In other environments, however, change is not homogeneous across user organizations, and parity now implies disparity later. Thus if we were to study a large number of organizations that committed to Windows 2000 several years ago, we'd expect to find them all pretty much doing the same things today; but if we followed companies that chose to adopt the same Linux server variant in the early 2000s, we would expect to find significant differences in their behaviors and applications architectures today.
Thus if you assume, as Carr does, that the Wintel PC defines IT, you could reasonably conclude first that there is no positive competitive advantage to be had from IT, and second that the scramble to lowest cost of service is the only legitimate basis for IT competition. Look beyond the PC, however, and you'll find both winners and losers among organizations willing to accept the risk of being different.
So if we restrict discussion to the risk takers, what a priori condition most distinguishes winners from losers?
I'm not aware of any good rule for predicting success, but there's one with essentially a 100% track record for predicting failure: organizations that mismatch IT skills and IT technologies waste both. In other words, they fail to realize the relative competitive advantage their technologies and/or staff skills could have given them.
Put a committed MCSE type in charge of Linux servers and you predictably get unreliable, high-cost, high-hassle services. This doesn't happen because the technology is bad or the IT guy is incompetent; it happens because they're mismatched. You don't put a world-class cyclist in a race car and expect him to win - and you can't blame either the car or the cyclist when he doesn't. Put a data processing professional in charge of Unix and the vast majority of the power of Unix will be destroyed: reliability will suffer, performance will decrease, and systems cost will rise - not because the data processing guy is incompetent, and not because Unix doesn't work, but because none of the things he knows for sure about what the IT job is and how to do it will match the technology.
Notice too that this applies regardless of the technologies involved: put a Unix guy in charge of a centralized Windows or mainframe systems architecture and he'll try to make it do things that are difficult or expensive in that environment while failing to do things that are relatively cheap and effective there.
Bottom line: if you want your type X technology to work as well as it possibly can, put a type X person in charge. Do anything else and both will underperform - often significantly so.
Unfortunately this rule doesn't cut both ways: getting the match wrong guarantees failure, but putting people with the right backgrounds in charge of systems architectures they understand doesn't guarantee anything.
So suppose a bunch of competing organizations split into two groups that go opposite ways: one group puts some smart MCSEs in charge of Wintel client-server architectures, and the other group hires Unix people to put in and run Unix/smart display architectures. What happens?
The Wintel users will all follow the Microsoft baton: competing on cost of service and ultimately falling into line with the simple reality that the more control they exert over the user desktop, the lower their total costs. Basically these guys will do more or less lockstep upgrades of their desktops, their skills, their control software, their applications, and their servers - meaning that if you look at them over short periods like a month or a quarter you'll see the lead change as early adopters get some benefits and late adopters get some short-term savings, but in the long term they'll all look about the same.
In contrast, IT practices among the Unix users will diverge. They may all start out with SAP on Red Hat, but before long some will be customizing Oracle on Solaris, adopting open source applications, swapping in SuSE, or extending what they have with little custom hacks.
There are chicken-and-egg style interlocking reasons for this behavior to emerge - and I'll write about them over the next two days. The most important, however, is that the reduction in IT stress and workload for both IT managers and users enables a change from IT management to IT leadership.
Understand, in this context, that management is about getting more hands on a job, but leadership is about getting more brains focused on a job. Managers have to be there when a job gets done; leaders don't.
With smart displays there is no desktop PC, and therefore no help desk - meaning that application support comes from lead users, not a PC guy. That pushes control out to users, and once they get control they'll push application and service evolution in ways that are unique to their departmental personalities, their ways of doing things, and their ideas about customer services and product delivery. The result is competitive differentiation because no two organizational groups ever stabilize their ideas at exactly the same place.
In other words, early adopters get a relative productivity kick, but later adoption narrows the competitive focus to user-driven change and thereby leads to the emergence of real IT winners - and even the relative losers are likely to beat the Wintel averages.
So what's the bottom line? If your IT management matches your IT technology, you'll get relative competitive advantage from letting your users, not your vendors, drive service evolution. You can do this with any technology, but it's consistent with core Unix ideas, and inconsistent with the cost-cutting and control-centralization agendas forced by Windows desktops - or mainframe costs. Meaning it's fundamentally a technology-neutral idea, but doing it with Unix is like running downhill, and doing it with Windows is a lot like slogging up a waterfall.