Virtualization Deemed 'Disruptive' to HPC

By Amy Newman
Posted Nov 18, 2010


Is it really a surprise to anyone that virtualization has a role to play in supercomputing?

Is virtualization really all that new to HPC?

It might have been to those behind SC10, the annual supercomputing conference being held this week in New Orleans, La. Each year, the conference cites technologies it believes have the potential to disrupt the HPC landscape, defining a "disruptive technology" as a drastic innovation in current practice with the potential to completely transform the HPC field as it currently stands.

This year's Disruptive Technologies showcase focuses on new computing architectures and interfaces expected to significantly impact the HPC field in the next five to 15 years but that have not yet emerged in current systems. Technologies cited include storage, programming, cooling, productivity software and virtualization.

Brocade was recognized for its virtual cluster switching, which has set out to revolutionize Layer 2 Ethernet, as was VMware for the virtualization options it brings to HPC. The System Research Team (SRT) at Oak Ridge National Laboratory, Ohio State University, Univa UD, and Deopli were cited as examples of where this technology has been applied.

Is virtualization disruptive? For run-of-the-mill data center computing, it certainly has been, so it stands to reason that it will have an even greater impact on HPC systems, where standardization and commodity hardware rarely come into play.

Interestingly, virtualization doesn't necessarily mean the same thing in the HPC world as it does elsewhere. As Sandia Labs and ScaleMP have recently demonstrated, virtualization in an HPC environment aggregates many physical machines into one large virtual system as often as it carves a single machine into many virtual ones. Such aggregation brings to mind grid computing, on-demand computing and utility computing -- terms whose definitions are very similar to one another, though admittedly not quite the same as virtualization. The concept isn't new.

In fact, the concept predates the late-20th-century computing buzz and hearkens back to the earliest days of the mainframe, when computing resources were shared and paid for based on how many units were consumed. When one thinks about it, that isn't all that different from another cutting-edge concept: cloud computing.

Obviously, the technology itself has gotten spiffier and faster. But much like in the fashion world, everything old is new again. Just as a raised hemline inspires you to purge your closet and hit Nordstrom because your shorter skirts from five years ago aren't cut the same as the ones on the runway, virtualizing your HPC infrastructure will result in more robust hardware and new software. The concepts behind the change, however, are nothing new.

Amy Newman is the senior managing editor of Internet.com's server vertical. She has been covering virtualization since 2001, and is the coauthor of Practical Virtualization Solutions, published by Pearson in 2009.
