Today I read an interesting article, "The Efficiency Paradox," in the latest Businessweek. It reviews the book The Conundrum: How Scientific Innovation, Increased Efficiency, and Good Intentions Can Make Our Energy and Climate Problems Worse by David Owen. I haven't read the book, but I got its main idea from the article.
The author argues that "most supposedly sustainable products and eco-living strategies are irrelevant or make the real problem worse." According to the article, his logic is "backed up by an economic principle known as 'rebound effect': advances in energy efficiency lower the cost of a given activity, which causes people to engage in that activity more, cancelling not only savings but also environmental benefits." For example, a 1940 aluminum beer can weighed "five times more than today's can of Bud Light." "The cost of popping a brew declined so that more people can do it, using up more aluminum, not less."
This made me think about virtualization, which has long been considered a way to green IT because it consolidates physical servers and therefore uses less energy. Will the "rebound effect" apply to virtualization? In other words, will people actually use more energy because of virtualization?
For one thing, physical-to-virtual consolidation truly saves energy; otherwise there wouldn't be so much enthusiasm for virtualization as a top IT priority in existing data centers. Beyond legacy data centers, though, the number of servers probably won't decline much as new cloud data centers are built. As I recall, a research report predicts server shipments will be flat or slightly down over the next few years. And energy consumption is not simply proportional to the number of servers; it's also proportional to per-server consumption. Servers running virtualization are likely more powerful, and cost more to build and run, than their predecessors. Taken together, I think there should be an immediate saving, or at least energy use won't grow as fast as before.
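To make the "count times per-server consumption" point concrete, here is a back-of-envelope sketch. Every number in it is a made-up assumption (legacy server wattage, consolidation ratio, host wattage), chosen only to show the shape of the calculation, not real data:

```python
# Hypothetical numbers: 100 legacy servers at 300 W each, consolidated
# 10:1 onto beefier virtualization hosts that draw 600 W each.
legacy_servers, legacy_watts = 100, 300
consolidation_ratio, host_watts = 10, 600

hosts = legacy_servers // consolidation_ratio        # 10 hosts after consolidation
before_kw = legacy_servers * legacy_watts / 1000     # 30.0 kW
after_kw = hosts * host_watts / 1000                 # 6.0 kW

print(f"before: {before_kw} kW, after: {after_kw} kW")
print(f"immediate saving: {1 - after_kw / before_kw:.0%}")  # 80%
```

Even with each host drawing twice the power of a legacy box, the 10:1 consolidation dominates, which is why the immediate saving is real.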
The real key is the long term; that's when the rebound effect usually manifests itself. With virtualization, because a virtual machine is so easy to create and supposedly so cheap to run, we'll end up with more virtual machines than ever before. Virtual appliances will also flatten the barrier to deployment, leading to many more virtual machines still.
Moreover, virtual machines are most likely left powered on all the time because of the perception of low energy consumption. Hopefully the license cost associated with running virtual machines pushes customers toward stricter policies to shut down unused ones. However, not every vendor charges such a license, and then the physical capacity of a server becomes the only limit. Even when a virtual machine is powered off, it still sits in storage, and the more storage there is, the more energy it uses.
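Continuing the back-of-envelope style, the rebound scenario can be sketched the same way. Again, every figure here is a hypothetical assumption (the sprawl factor, the share of VMs left powered on, the storage draw per retained VM image), purely to illustrate how sprawl and storage can eat the initial saving:

```python
# Hypothetical rebound scenario: easy cloning triples the VM population,
# most VMs stay powered on, and even powered-off VMs occupy storage
# that draws power.
host_watts = 600                     # assumed draw per virtualization host
vms_per_host = 10                    # assumed host capacity
storage_watts_per_vm = 5             # assumed storage draw per retained VM image

initial_vms = 100                    # one VM per migrated legacy server
sprawl_factor = 3                    # assumed growth after a few years of easy cloning
total_vms = initial_vms * sprawl_factor

powered_on = int(total_vms * 0.7)    # assume 30% sit powered off but kept on disk
hosts_needed = -(-powered_on // vms_per_host)        # ceiling division: 21 hosts

compute_kw = hosts_needed * host_watts / 1000        # 12.6 kW
storage_kw = total_vms * storage_watts_per_vm / 1000 # 1.5 kW
print(f"rebound total: {compute_kw + storage_kw:.1f} kW")  # 14.1 kW
```

Under these made-up assumptions the load climbs back from 6 kW to about 14 kW: still below the pre-virtualization 30 kW, but a large chunk of the saving has been given back, which is exactly the rebound pattern.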
Will these potential increases offset the initial savings? It's definitely possible. Still, almost every rule has exceptions, and I, and I suspect you too, hope virtualization will be an exception to the rebound effect.