Apache Guide: Logging, Part 4 — Log-File Analysis Page 3

Analog

The Analog web site (http://www.statslab.cam.ac.uk/~sret1/analog/) claims that about 29 percent of all web sites that use any log analysis tool at all use Analog, which, it says, makes it the most popular log analysis tool in the world. This fascinated me in particular, because until last week I had never heard of it. I suppose that this is because I was happy with what I was using, and had never really looked for anything else.

The example report, which you can see on the Analog web site, seemed very thorough and appeared to contain all of the stats that I might want. In addition to the pages and pages of detailed statistics, there was a very useful executive summary, which will probably be the only part your boss really cares about.

WebTrends

Another log analysis tool that I have been
introduced to in the past few months is
WebTrends. WebTrends provides astoundingly
detailed reports on your log files, giving you
all sorts of information that you did not know
you could get out of these files. And there are
lots of pretty graphs generated in the report.

WebTrends has, in my opinion, two counts
against it.

The first is that it is really expensive. You can look up the actual price on their web site (http://www.webtrends.com/default.htm).

The
other is that it is painfully slow. A 50MB log
file from one site for which I am responsible
(one month’s traffic) took about 3 hours to grind
through to generate the report. Admittedly, it’s
doing a heck of a lot of stuff. But, for the sake
of comparison, the same log file took about 10
minutes using WWWStat. Some of this is just the
difference between Perl’s ability to grind
through text files and C’s ability. But 3 hours
seemed a little excessive.

WWWStat

Now that I’ve mentioned it, WWWStat is the package that I’ve been using for about 6 years now. It’s fast, full-featured, and free. What more could you want? You can get it at http://www.ics.uci.edu/pub/websoft/wwwstat/, and there is a companion package (linked from that same page) that generates pretty graphs.

It is very easy to automate WWWStat so that it generates your log statistics every night at midnight, and then generates monthly reports at the end of each month.
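For example, a pair of crontab entries along these lines handles the scheduling. The paths and the exact wwwstat invocation are illustrative assumptions; adjust them for your own installation and log-rotation scheme.

    # Nightly at midnight: regenerate the running statistics page.
    # (Paths and arguments are assumptions; adjust for your install.)
    0 0 * * * /usr/local/bin/wwwstat /var/log/httpd/access_log > /var/www/stats/index.html

    # Shortly after midnight on the first of each month: report on last
    # month's traffic, assuming the logs rotate monthly into access_log.1.
    5 0 1 * * /usr/local/bin/wwwstat /var/log/httpd/access_log.1 > /var/www/stats/last-month.html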

It may not be as full-featured as WebTrends, but it has given me all the stats that I’ve ever needed.

Wusage

Another fine product from Boutell.com, Wusage is now in version 7. I’ve used it on and off through the years, and have always been impressed by not only the quality of the software but also the amazing responsiveness of the technical support staff.

You can get Wusage at http://www.boutell.com/wusage/

Or, You Can Do it Yourself

If you want to do your own log parsing and reporting, the best tool for the task is going to be Perl. In fact, Perl’s name (Practical Extraction and Report Language) is a tribute to its ability to extract useful information from logs and generate reports. (In reality, the name “Perl” came before the expansion of it, but I suppose that does not detract from my point.)
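For instance, here is a small script of my own, not part of any of the packages mentioned above, that tallies hits per URL from an access log in the Common Log Format and prints the ten busiest:

    #!/usr/bin/perl
    # Count hits per URL from a Common Log Format access log given on
    # the command line (or piped in on stdin), then print the top ten.
    use strict;
    use warnings;

    my %hits_per_url;

    while (my $line = <>) {
        # host ident user [date] "method url protocol" status bytes
        next unless $line =~ /^\S+ \S+ \S+ \[[^\]]+\] "\S+ (\S+)[^"]*" \d{3} \S+/;
        $hits_per_url{$1}++;
    }

    # Print the ten most-requested URLs, busiest first.
    my @top = sort { $hits_per_url{$b} <=> $hits_per_url{$a} } keys %hits_per_url;
    foreach my $url (@top[0 .. 9]) {
        last unless defined $url;
        printf "%8d  %s\n", $hits_per_url{$url}, $url;
    }

From there it is a short step to breaking the counts down by day, host, or status code.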

The Apache::ParseLog module, available from your favorite CPAN mirror, makes parsing log files simple, and so takes all the work out of generating useful reports from those logs.

For detailed information about how to use this module, install it and read the documentation. Once you have installed the module, you can get at the documentation by typing perldoc Apache::ParseLog.
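As a very rough sketch of what a report script might look like, something along these lines reads httpd.conf, parses the TransferLog, and prints a sorted count. The constructor and method names here are recalled from the module’s documentation rather than checked against it, so treat them as assumptions and confirm each one in the perldoc before relying on them.

    #!/usr/bin/perl
    # Sketch only: the Apache::ParseLog calls below are assumptions based
    # on my memory of the documentation; verify the real method names and
    # return values with `perldoc Apache::ParseLog`.
    use strict;
    use warnings;
    use Apache::ParseLog;

    # Point the module at httpd.conf so it can locate the log files itself.
    my $base = Apache::ParseLog->new("/usr/local/apache/conf/httpd.conf");
    my $transferlog = $base->getTransferLog();

    # The transfer-log object returns hashes of counts; hit() is one such
    # method (its exact keys are described in the perldoc).
    my %counts = $transferlog->hit();

    foreach my $key (sort { $counts{$b} <=> $counts{$a} } keys %counts) {
        printf "%8d  %s\n", $counts{$key}, $key;
    }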
