Although the technology required to virtualize data centers has been around for more than 40 years, 2007 will go down in history as the end of the beginning — the year the technology vaulted into mainstream consciousness and hurtled to the top of every CIO’s “must-do” list.
The oft-mentioned upside of virtualized data centers is the energy, time and money saved. With the technology, however, also come security and planning issues.
Now that everyone has caught on to the benefits this technology can deliver — chiefly the ability to shrink an organization's data center footprint by a factor of six, eight or 12 — the focus is turning to making sure all these efficiencies and cost savings don't come at the expense of data availability and security.
“We’ve jumped over the chasm without even looking down below,” Don Norbeck, director of product development at SunGard, said in an interview with InternetNews.com. “Luckily, we’ll probably land on the other side. But there’s still plenty of risk.”
These risks, from an organizational standpoint, start long before the first virtualization application is downloaded.
Just like any other software installation, committing to a virtualization project requires an appreciation for the technological vulnerabilities inherent in any operating system, such as bugs, malware and access control. But it also demands a fundamental understanding of exactly which applications and systems are used the most, which are the most critical to operations, when they're used, and how to orchestrate the workload of all these applications running on both physical and virtual servers.
For those charged with managing and maintaining one or more data centers, the temptation to simply initiate a straight-line consolidation — taking the workloads running on 100 servers and cramming them onto 10 or 15 servers — is alluring. Every company wants to get greener, lower energy consumption, reduce the size of its data centers and gain the ability to shift workloads with a simple click of a button.
“Now that everything is in the pool, people just have to push a button,” Norbeck said. “And they’ll keep pushing the button until the button doesn’t work anymore. Without proper planning and provisioning, you’re back to where you were before. How do you audit it? You push a button and then wait to see who starts complaining.”
In other words, just as all servers are not created equal, neither are the applications running in a corporate data center.
And the stakes are increasing.
This article was originally published on InternetNews.com.