I am deploying a website where users are authenticated before content is served. I need to report on site usage every month so that we can track usage, spot trends, etc.
Normally, I would examine the server logs and extract the information. The problem is that the website is deployed in a shared environment across clustered servers, and the log files are rotated daily. I would prefer to write the log information to a SQL database so that I can run the reports at any time.
I have read that Apache can be configured to pipe its log info to an external application, so my plan is to use a Perl app to filter the results and do the work. Basically, I will take the log info (timestamp, user ID and resource requested) and write it to an Oracle server as soon as I can.
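For reference, this is roughly what I have in mind. The log format, script path, table name and connection details are all placeholders, not a working setup:

```apache
# httpd.conf: define a minimal format (timestamp, authenticated user, resource)
# and pipe it to the Perl app instead of a file
LogFormat "%{%Y-%m-%dT%H:%M:%S}t %u %U" usagelog
CustomLog "|/usr/local/bin/usage-logger.pl" usagelog
```

And a naive line-at-a-time version of the Perl side:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connection details and table layout are placeholders
my $dbh = DBI->connect('dbi:Oracle:host=dbhost;sid=ORCL', 'loguser', 'secret',
                       { AutoCommit => 1, RaiseError => 1 });
my $sth = $dbh->prepare(
    'INSERT INTO access_log (log_time, user_id, resource) VALUES (?, ?, ?)');

# Apache writes one log line per request to our STDIN
while (my $line = <STDIN>) {
    chomp $line;
    my ($ts, $user, $resource) = split ' ', $line, 3;
    $sth->execute($ts, $user, $resource);
}
```

My worry is that one INSERT round-trip per request in this loop is exactly the kind of thing that could slow Apache down.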
I am looking for any advice on how to do this without impacting the speed of Apache. Can I read and write a line at a time, or do I need to cache log entries and write them in blocks? Would I benefit from multithreading the app so that the read and write operations are managed separately? Am I overlooking another way to achieve the underlying task?
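To make the buffering question concrete, this is the batched variant I was considering: hold AutoCommit off and commit every N rows, so the pipe from Apache is drained quickly and round-trips to Oracle are amortized. The batch size is a guess, not something I have measured, and the connection details are again placeholders:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Oracle:host=dbhost;sid=ORCL', 'loguser', 'secret',
                       { AutoCommit => 0, RaiseError => 1 });
my $sth = $dbh->prepare(
    'INSERT INTO access_log (log_time, user_id, resource) VALUES (?, ?, ?)');

my $BATCH   = 100;    # commit interval -- would need tuning under real load
my $pending = 0;

while (my $line = <STDIN>) {
    chomp $line;
    my ($ts, $user, $resource) = split ' ', $line, 3;
    $sth->execute($ts, $user, $resource);
    $dbh->commit if ++$pending % $BATCH == 0;
}
$dbh->commit;         # flush the final partial batch when Apache closes the pipe
$dbh->disconnect;
```

Is this the right general shape, or is a separate writer thread (or even spooling to a local file and bulk-loading later) the better approach?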
Advice and comments welcome!