Web Server Benchmarks / Web Server Compare
The Yin and Yang of Benchmarks
Benchmarking software, including Web servers, offers a tantalizing paradox: measurements are both concrete and tangible, yet at the same time vague and ambiguous. Benchmarks hand us hard numbers, but what do those numbers mean? Often very little in a broad sense, or sometimes something very specific in a narrow sense.
Web servers are not unlike zoo animals: they are quite reliant on their habitat. It's simply unrealistic to consider the performance of a specific Web server without regard to its operating environment -- both its software environment (the operating system) and its hardware environment (the physical components that make up the machine). All of these factors can dramatically influence the final numbers -- how quickly and nimbly a Web server can respond to and deliver on network requests.
What's more, benchmarks can vary widely in how they measure a server. A server's efficiency may also vary depending on the task (e.g., whether it's delivering static content to many simultaneous visitors or generating dynamic content).
The upshot of all this is that Web server benchmarks must be considered in a very limited manner. A benchmark for a server running on a particular operating system, on a particular hardware configuration, focuses only on that specific setup. The further the setup deviates from the tested habitat, the less meaningful the benchmark becomes.
Measuring Common Sense
Where hard numbers cannot tell the whole story, it is best to use common sense to fill in the gaps. Products integrated into homogeneous environments are often easier to bring up to speed. For example, one might recommend Microsoft IIS (Internet Information Server) as a good choice for an organization with an all-Microsoft infrastructure. In such an environment, it may well produce the best benchmarks. On the other hand, common sense tells us that a Microsoft Web server is not the only way to run a high-performance Web site, considering the wide deployment of both Netscape and especially the free Apache Web server on extremely high-volume sites. Yet, whereas Apache is an awkward fit in a Microsoft infrastructure, it wears like a comfy and warm autumn sweater in Unix environments, where it is most commonly found.
The most valid approach to using benchmarks is within a single habitat: your own. We have found the benchmarking tools listed below to be helpful when testing different Web servers in various environments. When tested on the same operating systems, hardware, and network connections, these benchmarks can reveal important differences between Web servers used in a given system.
Vendor and third-party sponsored benchmarks provide a starting point, but they do not replicate your system. Only your system is your system, but these tools can help you generate some first-hand benchmarks of your Web server's performance.
- ZD Labs freely distributes WebBench 3.0, software you can run on your own system to test its performance and compare against results obtained for other servers.
- Mindcraft's WebStone is a free, downloadable benchmarking system.
- SPECWeb99 is a commercial-grade benchmark system that sells for $200 to $800, depending on qualified pricing.
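To make the idea of first-hand benchmarking concrete, here is a minimal sketch in Python (purely an illustrative stand-in -- it is not one of the tools listed above, and real benchmarks measure far more, such as concurrency). It times repeated GET requests against a server and reports requests per second; a throwaway local server is started so the example is self-contained, but you would point `benchmark()` at your own server's URL.

```python
# Minimal first-hand benchmark sketch: time N sequential GET requests
# against a URL and report throughput and mean latency.
import threading
import time
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

def benchmark(url, n_requests=100):
    """Return (requests/sec, mean latency in seconds) for sequential GETs."""
    start = time.perf_counter()
    for _ in range(n_requests):
        with urllib.request.urlopen(url) as resp:
            resp.read()  # consume the body, as a real client would
    elapsed = time.perf_counter() - start
    return n_requests / elapsed, elapsed / n_requests

if __name__ == "__main__":
    # Spin up a throwaway local server (port 0 = pick any free port)
    # so the sketch runs anywhere; substitute your own server's URL here.
    server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
    port = server.server_address[1]
    threading.Thread(target=server.serve_forever, daemon=True).start()

    rps, latency = benchmark(f"http://127.0.0.1:{port}/", n_requests=50)
    print(f"{rps:.0f} req/s, {latency * 1000:.2f} ms mean latency")
    server.shutdown()
```

Even a toy like this illustrates the article's caveat: run it on different hardware, operating systems, or network paths and the numbers shift, which is exactly why results only carry weight within the habitat where they were measured.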
Web Server Benchmarks Around the Net
Presumably, these benchmarks have been performed without intentional bias.
- PC Magazine surveys popular Web servers using its own ZDNet WebBench 3.0 benchmarking tool.
A number of vendors have published or funded their own performance test results. Although these are sometimes noteworthy, keep in mind that vendor-sponsored benchmarks have a remarkable tendency to flatter the vendor's own products.
- A number of vendors including IBM and Compaq have submitted results of their own benchmarks, using SPECWeb99, to the SPEC Web site.