Does Virtualization Increase IT Costs? Depends on Who's Talking.

By Charlie Schluting
Posted Apr 16, 2010


We all know virtualization isn't a magic bullet that makes all your management concerns disappear, but some claim it can actually increase management costs. While it's true that you'll have to shift your management focus and retrain your IT staff, this column argues that a higher overall expense doesn't necessarily follow.


Recently, Cisco started saying that virtualization doesn't actually save money because of the added management costs of running a virtual infrastructure. Sure, in the same glossy brochure Cisco was selling its "unified computing" system and management tools, but it may have had a point. Running a virtualized environment can sometimes feel like more work.

It's certainly different, which means there's a one-time learning curve. But the ramp-up cost of learning the management tools for a given virtualization platform is a one-time, sunk cost; it shouldn't factor into the calculation of yearly IT management costs.
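To make that distinction concrete, here is a minimal back-of-the-envelope sketch in Python. The figures are purely hypothetical and exist only to show how a one-time ramp-up cost stays separate from the recurring yearly management cost.

    # Hypothetical figures: the point is only that one-time training is
    # separate from the recurring cost of managing the environment.
    one_time_training = 15_000       # ramp-up cost to learn the new tools (hypothetical)
    yearly_mgmt_physical = 120_000   # yearly management cost, bare metal (hypothetical)
    yearly_mgmt_virtual = 100_000    # yearly management cost, virtualized (hypothetical)

    yearly_delta = yearly_mgmt_physical - yearly_mgmt_virtual
    first_year_net = yearly_delta - one_time_training

    print(f"Recurring yearly difference: ${yearly_delta:,}")
    print(f"Year one, including ramp-up: ${first_year_net:,}")
    print(f"Every later year:            ${yearly_delta:,}")

However the hypothetical numbers shake out in a real shop, the training expense shows up once; the recurring line items are what matter for a yearly comparison.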

There are, however, fundamental differences in the way virtualized servers are managed. If those differences add substantial management overhead compared with running bare-metal servers, there may be something to this notion. Let's look at four aspects of managing a virtual environment: deployment, managing changes, monitoring, and tuning.


Read "Does Virtualization Increase IT Management Costs?" at Enterprise Networking Planet


