Servers using Intel's latest processors and the motherboards that go with them are, in terms of work done per unit of power consumed at full utilization, among the most efficient the x86 industry has yet produced.
Unfortunately, a lot of that efficiency results from scale, and thus from workload consolidation and high system utilization - to quote Sun's bmseer:
SPECpower_ssj results show that servers (even those with the industry's best power management) running at low utilization levels use many times more watts per unit of work than systems running at higher utilization levels. Datacenters can realize the biggest energy savings by running fewer servers at higher utilization levels (50% utilization or above). Sun's results on the 8GB (or 0.5GB/core) configuration show that running at 10% utilization requires 4.4 times more power per unit of work than running at 50% utilization.
4.4 times = (581 performance-to-power @ 50% utilization / 133 performance-to-power @ 10% utilization)
Most SPECpower_ssj2008 results are published on small-memory configurations that are much smaller than typical customer deployments. Sun is the only vendor to publish multiple results that clearly show the effect of memory configuration. A more normal-sized memory configuration of 32GB (or 2GB/core) uses 30% more watts than a tiny 8GB (or 0.5GB/core) configuration at 100% load. At active-idle the wattage difference is also 30%. Some competitors use additional configuration differences, such as non-redundant fans, non-redundant power supplies, and a single slow disk, to further reduce wattages and significantly improve SPECpower_ssj scores.
He's complaining about competitors misleading customers, but the more important thing here is the extent to which utilization rates affect power efficiency: that 32GB, 16-core Netra 4250 burns 225 watts doing nothing but making heat you also have to pay to remove - and that minimum only goes up by about 71 watts (31%) as you add workload until you max out the machine.
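To put those numbers in annual dollar terms, here's a back-of-the-envelope sketch; the SPECpower ratios and the Netra wattages come from the text above, but the electricity rate is an illustrative assumption, not a quoted figure:

```python
# Back-of-the-envelope arithmetic for the figures quoted above.
# The $0.10/kWh electricity rate is an illustrative assumption;
# the SPECpower ratios and Netra wattages come from the text.

HOURS_PER_YEAR = 24 * 365  # 8,760

# SPECpower_ssj2008 performance-to-power ratios, 8GB configuration
ratio_50 = 581.0  # ssj_ops per watt at 50% utilization
ratio_10 = 133.0  # ssj_ops per watt at 10% utilization
print(f"watts per unit of work at 10% vs 50%: {ratio_50 / ratio_10:.1f}x")  # ~4.4x

# The 32GB, 16-core Netra 4250 discussed above
idle_watts = 225.0
full_watts = idle_watts + 71.0  # roughly 31% above idle at full load

rate = 0.10  # assumed $/kWh - substitute your own utility's figure
for label, watts in (("idle", idle_watts), ("full load", full_watts)):
    cost = watts / 1000 * HOURS_PER_YEAR * rate
    print(f"{label}: {watts:.0f}W is about ${cost:,.0f}/year before cooling")
```

Double that rate, as the next paragraph anticipates, and an idle machine is costing you roughly $400 a year before you count the cooling bill.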
Now given the weathermen's commitment to more than doubling American power costs over the next eighteen months, the obvious conclusions you should be reaching from these numbers are to run fewer servers, load each of them to 50% utilization or better, and shut down anything you can't keep busy.
Since these directions are consistent with what a large part of the industry wants to do for other reasons (mostly having to do with the consolidation of a different kind of power), the new energy taxes will, I expect, simply but radically accelerate an existing trend.
There are, however, several big negatives to doing this - the most important being that consolidation trades away business flexibility, and with it the staff depth and spare capacity needed to respond when the business changes.
Saving power dollars at the expense of business flexibility may, in other words, illustrate the adage about being penny wise and pound foolish.
In other words, it's not wrong to reduce server room power use, but the big opportunities for savings aren't in the data center, they're on the desktops. In fact, if data center management measured system-wide CPU utilization, they'd quickly realize that every successful move to increase data center utilization through consolidation actually reduces system-wide processor utilization.
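It's easy to see why the desktops dominate with some illustrative numbers - every fleet size and utilization figure below is an assumption chosen for the example, not a measurement. Because PCs vastly outnumber servers, the system-wide number is pinned near the desktop figure no matter what you do in the server room:

```python
# Illustrative sketch - every fleet size and utilization figure here
# is an assumption chosen for the example, not a measurement.

def system_utilization(fleets):
    """Capacity-weighted average utilization over (machine_count, utilization) pairs."""
    total = sum(n for n, _ in fleets)
    busy = sum(n * u for n, u in fleets)
    return busy / total

desktops = (1000, 0.05)       # 1,000 PCs, roughly 5% busy
servers_before = (200, 0.10)  # 200 lightly loaded departmental servers
servers_after = (40, 0.50)    # the same work packed onto 40 machines at 50%

print(f"before consolidation: {system_utilization([desktops, servers_before]):.1%}")
print(f"after consolidation:  {system_utilization([desktops, servers_after]):.1%}")
# Both numbers sit within a point or two of the 5% desktop figure:
# the server room is a rounding error next to the idle desktop fleet.
```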
Two things seem to happen: first, the slack - both machine capacity and people - that lets IT respond quickly to new demands gets cut as part of the consolidation savings; and second, the staff who survive the cuts tend to be data center operations specialists rather than people who work directly with users.
Thus what happens is that consolidation reduces the rate of externally driven systems change in organizational IT services without reducing the external pressure for change. As a result, business management works to find ways around real or perceived IT roadblocks and eventually, of course, IT management has to respond to the pressure by doing something - but then discovers that there's nobody left on the bench to do it and, worse, that the people kept on for their fine work in the data center don't speak user.
So is there a better idea? Something that responds effectively to exploding power costs while not incurring these negatives?
There is, and it's the opposite of what most people will be doing: instead of consolidating everything to the data center, get rid of the data center. Move your processors and IT staff into user spaces, replace as many 80+ watt desktop computers with 4 watt desktop displays as you can, and use Unix, not ghosting, to run as many applications as you need on each box.
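The desktop half of that arithmetic is easy to sketch; the wattages come from the paragraph above, while the fleet size, hours, and electricity rate are illustrative assumptions:

```python
# Illustrative desktop-replacement savings. The 80W and 4W figures come
# from the text; fleet size, duty cycle, and electricity rate are assumptions.

desktops = 500          # assumed fleet size
pc_watts = 80.0         # typical desktop PC, per the text
display_watts = 4.0     # desktop display / thin client, per the text
hours = 24 * 365        # assume machines are left on around the clock
rate = 0.10             # assumed $/kWh

saved_kwh = desktops * (pc_watts - display_watts) * hours / 1000
print(f"{saved_kwh:,.0f} kWh/year saved, about ${saved_kwh * rate:,.0f} "
      f"before cooling, for {desktops} desktops replaced")
```

The displaced cycles move onto shared Unix servers, of course, but per the SPECpower numbers above those servers do the work many times more efficiently when kept at high utilization.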
The downside to this approach is that it's hard to carry out and unpopular with your peers, but the bottom line is that there are huge advantages for your employer in doing it - starting, not with the cost savings you'll achieve, but with having IT actually work with and serve the people who make the money rather than the people who spend it.