Improving mod_perl Driven Site's Performance -- Part II: Benchmarking Applications -- Page 3




The timeout option defines the number of seconds the client is
willing to wait to hear back from the server. If this timeout
expires, the tool considers the corresponding call to have failed.
Note that with a total of 27,000 connections at a rate of 150 per
second, the total test duration will be approximately 180 seconds
(27,000/150), independent of the load the server can actually
sustain. Here is a result one might get:

     Total: connections 27000 requests 26701 replies 26701 test-duration 179.996 s

     Connection rate: 150.0 conn/s (6.7 ms/conn, 

     Request rate: 148.3 req/s (6.7 ms/req)
     Request size [B]: 72.0

     Reply rate [replies/s]: min 139.8 avg 148.3 max 150.3 stddev 2.7 (36 samples)
     Reply time [ms]: response 4.6 transfer 0.0
     Reply size [B]: header 222.0 content 1024.0 footer 0.0 (total 1246.0)
     Reply status: 1xx=0 2xx=26701 3xx=0 4xx=0 5xx=0

     CPU time [s]: user 55.31 system 124.41 (user 30.7% system 69.1% total 99.8%)
     Net I/O: 190.9 KB/s (1.6*10^6 bps)

     Errors: total 299 client-timo 299 socket-timo 0 connrefused 0 connreset 0
     Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
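As a sanity check, the key derived figures in such a report can be recomputed from the raw numbers. A quick sketch, with the values copied from the sample output above:

```shell
# Figures taken from the httperf run above.
CONNECTIONS=27000
RATE=150
ERRORS=299

# Expected test duration: total connections / connection rate.
DURATION=$((CONNECTIONS / RATE))
echo "expected duration: ${DURATION}s"    # close to the reported 179.996 s

# Client-timeout errors as a percentage of all attempted connections.
ERR_PCT=$(awk "BEGIN { printf \"%.1f\", $ERRORS * 100 / $CONNECTIONS }")
echo "error rate: ${ERR_PCT}%"
```

Comparing the recomputed duration and error rate against the report is a quick way to confirm you are reading the output correctly.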

Benchmarking Response Times with http_load

http_load is yet another utility for webserver load
testing. It can simulate a 33.6 kbps modem connection (-throttle) and
allows you to provide a file with a list of URLs, which will be fetched
randomly. You can specify how many parallel connections to run using
the -parallel N option, or you can specify the number of requests
to generate per second with the -rate N option. Finally, you can tell
the utility when to stop by specifying either the test time length
(-seconds N) or the total number of fetches (-fetches N).

Here is a sample run, with the file urls containing:

  http://www.example.com/foo/
  http://www.example.com/bar/

We ask the utility to generate three requests per second and to run
for only two seconds. Here is the generated output:

  % ./http_load -rate 3 -seconds 2 urls

  http://www.example.com/foo/: check-connect SUCCEEDED, ignoring
  http://www.example.com/bar/: check-connect SUCCEEDED, ignoring
  http://www.example.com/bar/: check-connect SUCCEEDED, ignoring
  http://www.example.com/bar/: check-connect SUCCEEDED, ignoring
  http://www.example.com/foo/: check-connect SUCCEEDED, ignoring
  5 fetches, 3 max parallel, 96870 bytes, in 2.00258 seconds
  19374 mean bytes/connection
  2.49678 fetches/sec, 48372.7 bytes/sec
  msecs/connect: 1.805 mean, 5.24 max, 0.79 min
  msecs/first-response: 291.289 mean, 560.338 max, 34.349 min

So you can see that it has reported about 2.5 requests per second. Of
course, for a real test you will want to load the server more heavily
and run the test for longer to get more reliable results.
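The fetches/sec figure is simply the number of fetches divided by the wall-clock time. Recomputing it from the numbers in the run above:

```shell
# 5 fetches completed in 2.00258 seconds (from the http_load output above).
awk 'BEGIN { printf "%.5f\n", 5 / 2.00258 }'    # 2.49678, matching the report
```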

Note that when you provide a file with a list of URLs, make sure it
contains no empty lines. If it does, the utility won't run, and will
complain:

  ./http_load: unknown protocol -
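One way to guard against this is to strip blank lines from the file before running the tool. A small sketch, using a hypothetical urls file in the same format as the example above:

```shell
# Hypothetical 'urls' file with a stray blank line in the middle:
printf 'http://www.example.com/foo/\n\nhttp://www.example.com/bar/\n' > urls

# Delete blank (and whitespace-only) lines, then replace the original file:
sed '/^[[:space:]]*$/d' urls > urls.clean
mv urls.clean urls

wc -l < urls    # only the two URL lines remain
```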

Benchmarking Response Times with the crashme Script
