Don't Cut Corners on Server Defragmentation

By Drew Robb (Send Email)
Posted Sep 18, 2009


The subject of fragmentation has been around for years. It dates back, I believe, to the predecessors of the early VAX machines running OpenVMS. On those systems, files could be written only contiguously, so once a server had seen some use, the disk was littered with freed areas of space that no longer carried data yet weren't big enough to hold new files.

Hard-Core Hardware: Fragmentation may not cut it as a big screen villain, but it remains a threat and handicap to optimal server performance. In this era of massive hard drives and virtualization, minimizing fragmentation is more critical than ever.

So the concept of fragmentation was born: the ability to split up files and write the pieces into whatever free spaces were available on the disk. Despite fragmentation becoming a big problem on OpenVMS, the Windows NT developers incorporated file fragmentation into the heart of Windows, where it remains to this day.
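The mechanics above can be illustrated with a toy sketch. This is not any real filesystem's allocator, just a first-fit model of a disk as a list of blocks: deleting data leaves small gaps, and a new file larger than any single gap gets split across several of them.

```python
# Toy illustration (not any real filesystem's allocator): the disk is a
# list of blocks, and files are written first-fit into whatever free
# (None) slots exist, so a large file ends up split into fragments.

def write_file(disk, name, size):
    """Place `size` blocks of `name` into free slots, first-fit."""
    placed = 0
    for i, block in enumerate(disk):
        if block is None:
            disk[i] = name
            placed += 1
            if placed == size:
                return
    raise IOError("disk full")

def fragments(disk, name):
    """Count the contiguous runs a file has been split into."""
    runs, prev = 0, None
    for block in disk:
        if block == name and prev != name:
            runs += 1
        prev = block
    return runs

disk = [None] * 12
write_file(disk, "A", 4)          # A fills blocks 0-3
write_file(disk, "B", 4)          # B fills blocks 4-7
for i in range(0, 8, 2):          # free alternating blocks, leaving gaps
    disk[i] = None
write_file(disk, "C", 6)          # C must scatter across the gaps
print(fragments(disk, "C"))       # → 5 separate runs for one file
```

Defragmentation is simply the reverse pass: moving those scattered runs back into one contiguous stretch so a read needs one seek instead of five.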

So if you ever wondered why Windows servers and PCs get progressively slower over time, the answer lies in fragmentation. It's also the reason for long boot times, very slow document and application loading, and a higher incidence of system errors and freezes. Ironically, the amount of defragmentation getting done appears to be dropping off as companies look to cut costs. It seems they have gotten so used to defrag running in the background that they no longer appreciate the value of a very simple technology.

Big mistake.

I guess it's the nature of the beast. Anti-virus and other security software isn't being cut back much in IT departments. After all, it's difficult to justify the removal of an obvious threat from the annual budget. You see lots of movie plots about the damage done by a virus or other malware; I've yet to see a movie where the villain of the piece was fragmentation. It just doesn't have that same fear factor.

Recent personal experience and some digging around have convinced me anew that defragmentation is a must-have in any server shop. My own machine was experiencing these really annoying slowdowns, and some apps would freeze from time to time. It also took a while to boot up. I ran a fragmentation analysis and was horrified to see how badly cut up the files and free space had become. A quick run of Diskeeper and the change in performance was marked.

Inspired, I decided to check around to see how it was working out in the data center. I found one system manager who believed defrag to be essential for maintaining database performance. He used it on all his servers by setting it up to run in the background.

Another user with an EMC storage environment, as well as Windows servers and databases, keeps a defrag program running despite making heavy use of RAID and virtualization. He had thought that virtualization, lots of memory and the use of RAID might minimize the need for defrag. Not so. He initially ran the software manually on an as-needed basis but now has it set to run automatically so fragmentation never gets too bad.
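For readers wanting to replicate that "set it to run automatically" setup without third-party software, one approach is the built-in Windows defrag.exe driven by Task Scheduler. The task name and schedule below are illustrative assumptions, not from the article:

```shell
:: Analyze the volume first to see whether it needs attention
defrag C: -a

:: Schedule a weekly background defrag of C: (illustrative task name/time)
schtasks /Create /SC WEEKLY /D SUN /ST 02:00 /TN "WeeklyDefrag" /TR "defrag.exe C:"
```

Commercial tools such as Diskeeper add continuous background operation on top of this, but even the scheduled built-in pass keeps fragmentation from compounding unchecked.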

One interesting application was by a server administrator in a government shop. On any new box, once the initial OS and base applications are installed, he runs a full defrag to make the applications as contiguous as possible. It turns out that even with vast amounts of hard drive space available, Windows loads itself in a very scattered fashion all over the disk. Applications receive similar treatment. By making the software installs contiguous, he sees much faster response times on new servers. Then he keeps the defrag software running in an effort to prevent much fragmentation from taking hold.

The takeaway appears to be this: Defragmentation may not be sexy, but it is very necessary if you want to get the most out of your servers.

Further, in this era of massive hard drives and virtualization, it turns out that defragmentation is more needed than ever. The bigger the hard drive, the more opportunity there is for files to be split into thousands or even tens of thousands of pieces. Think about that for a moment: to retrieve one such file, instead of a single read operation, the disk head must thrash around thousands of times to gather every piece before the file can be loaded.
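A rough back-of-the-envelope calculation makes the point concrete. The per-fragment cost here (~12 ms of seek plus rotational latency, typical of a 2009-era 7,200 rpm disk) is an assumed, illustrative figure:

```python
# Back-of-the-envelope seek overhead of reading a fragmented file.
# SEEK_MS is an assumed average seek + rotational latency per fragment
# (~12 ms, illustrative); sequential transfer of the data itself costs
# the same whether the file is contiguous or not.

SEEK_MS = 12.0

def extra_read_time_ms(fragments):
    """Seek overhead beyond the single seek a contiguous file needs."""
    return (fragments - 1) * SEEK_MS

print(extra_read_time_ms(1))       # contiguous file: 0.0 ms of overhead
print(extra_read_time_ms(10_000))  # 10,000 fragments: ~120,000 ms, about
                                   # two minutes spent just moving the head
```

Even if real disks mitigate this with caching and request reordering, the trend is the linear one the sketch shows: every additional fragment is another head movement.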

As for virtualization, initial studies suggest virtualization software seems to worsen fragmentation. Obviously, by creating and discarding VMs with great frequency, free space can get hacked up rapidly. But it may go further than that. I'm looking into how VMs are written and whether the combination of Windows plus VMware results in higher levels of fragmentation. That is the working theory — more on that to come.

Drew Robb is a freelance writer specializing in technology and engineering. Currently living in California, he is originally from Scotland, where he received a degree in Geology/Geography from the University of Strathclyde. He is the author of Server Disk Management in a Windows Environment (CRC Press).



