Is there that much of a difference?
Let's see. One of Liquidweb's data centers can hold 25,000 servers. At $100 in savings per server, that works out to an annual savings of $2.5M. Over the 5-year life of a server, that's $12.5M. Sounds like a lot of money to me.
*If* you can save 50% on energy costs, you save $100 on energy per server. If you can only save 25%, that's $50. Still good...
BUT for that, you need to replace all the processors now, presumably ahead of the expected replacement date.
Suppose the cost per processor is $250 (based on the low-end prices listed by Intel). Replacing the processors then requires an upfront investment of $6.25M, plus the cost of physically swapping them out (replacing and testing 25K units is a lot of work), the opportunity cost of the personnel tied up doing the replacement, and of course the administrative costs. I wouldn't venture a guess on those, but I suspect they are neither free nor trivial.
At $100 in savings per server, it will take 3-4 years to break even; at $50 per server, you will not break even before the servers are retired anyway.
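For what it's worth, here's the back-of-envelope math as a quick Python sketch. The numbers are my assumptions from above (25,000 servers, $250 per CPU, $100 or $50 saved per server per year), and it ignores labor and admin costs, which only push break-even further out:

```python
# Rough break-even sketch -- assumed numbers, not Liquidweb's actual figures.
servers = 25_000
cpu_cost = 250                     # assumed low-end Intel list price per CPU
upfront = servers * cpu_cost       # $6.25M in CPUs alone, before labor/admin
server_life_years = 5

for saving_per_server in (100, 50):            # 50% vs. 25% energy saving
    annual_saving = servers * saving_per_server
    breakeven_years = upfront / annual_saving
    print(f"${saving_per_server}/server/year: upfront ${upfront / 1e6:.2f}M, "
          f"break-even in {breakeven_years:.1f} years "
          f"(server life: {server_life_years} years), "
          f"before labor and admin costs")
```

That gives 2.5 years for the $100 case and 5.0 years for the $50 case on hardware alone; add the replacement labor and overhead and you land at the 3-4 years and "never" figures above.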
Additionally, the existing servers are due for replacement at a later date. It is reasonable to expect that by then, technology will have moved forward and processors will be even more energy-efficient than they are now, negating the earlier savings, unless you are willing to re-replace the CPUs at shorter intervals.
If you save $12.5M on energy but end up spending $20M on CPUs and related expenses, you have lost $7.5M.
What I am saying is that just because it looks like a great deal doesn't mean it is. Energy consumption should be taken into account, sure, but you have to account for ALL consequences and all expenses involved before you can say that it will save money.