
Interop: From Server Virtualization to Private Cloud

LAS VEGAS - Moving from bare metal to virtualization and private cloud server infrastructures is not always an easy task.

In a panel discussion at Interop this week, vendors and customers discussed some lessons learned along the path of server virtualization.

Panelist Mike Leeper, Director of Global Technology at Columbia Sportswear, shared the gory details of how his group went from bare metal to near-complete virtualization inside of the last three years.

For Columbia Sportswear, it all started in early 2010, when Leeper was tasked with building a new datacenter and a new disaster recovery system. It's a journey that eventually led Leeper to a virtualized Vblock system from VCE, a joint venture between Cisco, EMC and VMware.

For Leeper, the key was being able to advance easily from standard VMware server virtualization to the private cloud world. The private cloud adds the ability for rapid auto-deployment and self-provisioning.

"I'm not going to be able to add value by designing a new garment, but what I can do is make sure my private cloud can meet the demands of the business today," Leeper said. "So when we have a new product, we now have a platform that we can rapidly deploy and that's how I add value to the business, and private cloud is the heavily lifting that enables that."

Rodrigo Flores, Cloud Enterprise Architect at Cisco, commented that what the private cloud model enables over server virtualization alone is a service catalogue approach. Instead of building custom virtual servers as needed, IT offers a pre-built menu from which users can provision.

"It's a big transformation," Flores said. "People want their entire stack, not just a VM with an operating system."


For Leeper, the biggest challenge in moving to a fully virtualized private cloud wasn't the technology; it was the people.

"For us to get to a high 90 percentile virtualized environment, we had to convince people we could run workloads under virtualization," Leeper said. "DBAs didn't think could meet the performance needs."

So Leeper gave the database team all the resources they wanted, with lots of RAM and compute. Then, once he had them up and running, he showed them what they were actually consuming. As it turned out, what the database team thought they needed and what they actually used were two different things.

"We had a more data-rich mature conversation with users of the platform," Leeper said.

Flores suggested that it's a good idea to start small when it comes to moving to a full private cloud.

"Crawl before you can walk," Flores said. "Failures are often due to process maturity and people; you can't put that in a box."

Sean Michael Kerner is a senior editor at ServerWatch and InternetNews.com. Follow him on Twitter @TechJournalist.


This article was originally published on May 9, 2013
