toadi has asked for the wisdom of the Perl Monks concerning the following question:


I had to write a simple CGI that does page-access counting: seeing where visitors came from and counting unique hits.

My solution was an SSI executed with each page accessed. BAD: server load was too high (I work for a large ISP).

Are there better methods/tools to do the job?

Replies are listed 'Best First'.
Re: CGI and Server-load
by davorg (Chancellor) on Jun 22, 2000 at 16:48 UTC
Re: CGI and Server-load
by lhoward (Vicar) on Jun 22, 2000 at 17:02 UTC
    If you really need to do the analysis online in real time, and not as batch logfile analysis (as mentioned above), I see two possibilities:
    1. Real-time log analysis. Have a process that tails the access logs and counts hits in real-time. This saves the overhead of having to launch a separate process each time you want to increment a count.
    2. Use mod_perl. With mod_perl you can run Perl inside of Apache, so you don't have the overhead of launching a separate process. With Apache::Registry you can even use your current cgi-bin Perl programs with no modification and still get the "run inside of Apache without a separate process" advantage.
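    A minimal sketch of option 1, assuming the Apache common log format. The sample lines, the regex field layout, and the "unique visitor = unique host" shortcut are all illustrative; a real version would keep the filehandle open on the live access log and seek/sleep to follow it, tail -f style, instead of walking a fixed list:

```perl
#!/usr/bin/perl
# Sketch: count hits per page and per client host as log lines arrive.
# The "log" here is an inline sample; a real tailer would loop forever,
# reading new lines from the live access_log as they are appended.
use strict;
use warnings;

my (%hits, %hosts);

my @sample_log = (
    '10.0.0.1 - - [22/Jun/2000:16:48:00 +0000] "GET /index.html HTTP/1.0" 200 1043',
    '10.0.0.2 - - [22/Jun/2000:16:48:05 +0000] "GET /index.html HTTP/1.0" 200 1043',
    '10.0.0.1 - - [22/Jun/2000:16:48:09 +0000] "GET /about.html HTTP/1.0" 200 512',
);

for my $line (@sample_log) {
    # Apache common log format: host ident user [date] "request" status bytes
    if ($line =~ m{^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)}) {
        my ($host, $page) = ($1, $2);
        $hits{$page}++;
        $hosts{$host}++;    # crude stand-in for "unique visitors"
    }
}

for my $page (sort keys %hits) {
    print "$page: $hits{$page}\n";
}
print "unique hosts: ", scalar(keys %hosts), "\n";
```

    Because everything stays in one long-running process, incrementing a hash entry replaces the per-request process launch that made the SSI approach so expensive.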
      mod_perl would remove the process-launch overhead, but he's using IIS, and mod_perl only works with Apache, doesn't it?
      I think the real-time option is the better one: the server already does logging for you, so it seems like a good idea to use the information that is already provided. I don't know what logging IIS has, but there must be something.
      I would be tempted to use a cron job (or similar) and grab the stats on a daily basis.
        Yes, mod_perl only works with Apache. The original question didn't state anything about IIS, so I assumed Apache (I should have mentioned that assumption in my original post). I think it's a safe bet that most Perl users are using Apache rather than other web servers.

        Of course, depending on how the script is called, the user could set up his script on a separate, dedicated "hit counting" box running Apache (it's easy to repurpose an old PC as a Linux box for this purpose in many environments). Or maybe use the first of those new Sun boxes they're getting to build the new hit-counter box. It's a good way to test the new hardware, and you don't have to migrate the whole system to Sun to get it to work.

Re: CGI and Server-load
by t0mas (Priest) on Jun 22, 2000 at 16:49 UTC
    What is the purpose of the script? Will it display info on the pages?
    If it's just for counting and "off-line" analyzing, I would go for a custom log (you do run Apache, don't you? :) and do/find a log-analysis script.
    Maybe mod_usertrack (again, if you run Apache) is what you need.
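    For reference, a custom log that captures the referer (so the "where did visitors come from" question can be answered offline) might look like this in httpd.conf. This is the standard "combined" format; the log path is just an example:

```apache
# httpd.conf: common log fields plus referer and user-agent
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined
CustomLog /var/log/apache/access_log combined
```

    The referer field then shows up as the second-to-last quoted string on every line, ready for a nightly analysis script.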

    /brother t0mas
Re: CGI and Server-load
by toadi (Chaplain) on Jun 22, 2000 at 16:58 UTC
    We are making the transition from NT to Sun Solaris.
    But for the moment I needed a fast way to do this.

    I already wrote a script to parse Apache logs (well hell, I'm a Linux adept).
    But now it is IIS. I know it's bad, but for the moment I have to deal with it. We're no small company and the load is BIG (that's why the Suns are coming).

    Is there a better solution than the one I had?

    Of course there are companies like sitestat, but I'd like to do it myself!
      If you like to do it yourself, then do it. ;-)
      If you'd like a little help getting started, there are lots of log analyzers out there. Some of them are written in Perl and claim to work with IIS. Have a look at the WWWstats app.

      /brother t0mas
Re: CGI and Server-load
by cbraga (Pilgrim) on Jun 22, 2000 at 19:12 UTC
    Would it be any easier on the server if you used ASP?

    I think all you have to do is include a tiny amount of ASP (or PHP, or Perl) code in each page, which would send a cookie to every user. If, when the script ran, the cookie was already set, then the visit was not unique. If it was not already set, then the visit was unique and you'd increment the counter in a database or elsewhere.
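    A sketch of that idea as a plain Perl CGI, with no modules so it should run behind either server. The cookie name `seen_before` and the "bump a counter" placeholder are illustrative, not any fixed API:

```perl
#!/usr/bin/perl
# Decide whether a request is a first visit by looking for our cookie in
# the HTTP_COOKIE environment variable that the web server sets for CGIs.
use strict;
use warnings;

# Returns the HTTP header to print.  A first visit gets a Set-Cookie line;
# the caller should also bump the unique-visitor counter in that case.
sub visitor_header {
    my $cookies = shift || '';
    if ($cookies =~ /\bseen_before=1\b/) {
        return "Content-type: text/html\r\n\r\n";    # repeat visit
    }
    return "Content-type: text/html\r\n"
         . "Set-Cookie: seen_before=1; path=/\r\n\r\n";
}

print visitor_header($ENV{HTTP_COOKIE});
# ...on a first visit, increment the counter in a database or dbm file...
```

    As the replies below note, this only counts visitors whose browsers accept the cookie, so the "unique" figure will undercount cookie-rejecting users.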

      Lots of people don't accept gratuitous cookies from random sites. I personally have to click OK for every cookie that gets set on my system. If people are rejecting cookies, then this isn't going to work.

      nuance's pet hate of the day: I really hate sites that try to set cookies every time you reference anything (images etc.). You have to click "no" about 30 times every time you change a page on their site. I always avoid sites like this.