Well, the process is multithreaded. So even though a single process generates at most 100 MB a day at the top logging level, after 40-50 forks we're talking 4-5 GB a day. That compounds the problem, because I'm trying to keep the logs sorted and rotated. And even though I can turn down the detail, the bugs I'm finding require a high level of detail to track down.
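For what it's worth, one way to tame the per-fork volume (a minimal sketch in Python, since the post doesn't show the script; make_worker_logger and spider_logs are illustrative names, not from the original) is to give each process its own rotating log file keyed by PID. Rotation is then bounded per worker, and there's nothing to merge or re-sort afterward:

    import logging
    import logging.handlers
    import os

    def make_worker_logger(log_dir="spider_logs"):
        """One log file per process: each fork writes to its own
        rotating file, so nothing has to be merged or re-sorted later."""
        os.makedirs(log_dir, exist_ok=True)
        logger = logging.getLogger("spider.%d" % os.getpid())
        logger.setLevel(logging.DEBUG)
        handler = logging.handlers.RotatingFileHandler(
            os.path.join(log_dir, "worker-%d.log" % os.getpid()),
            maxBytes=10 * 1024 * 1024,  # rotate each file at 10 MB
            backupCount=5,              # cap each worker at ~50 MB on disk
        )
        handler.setFormatter(logging.Formatter(
            "%(asctime)s pid=%(process)d %(levelname)s %(message)s"
        ))
        logger.addHandler(handler)
        return logger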
(The script is a web spider, and most of the bugs I encounter with it involve bizarre or broken HTML in web pages. To figure out just what is going on, I want to log lots of info whenever there's an anomaly. The problem becomes how to do that without being too processor-intensive.)
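For the "lots of detail, but only when something breaks" part, a common pattern (again a Python sketch; RingBufferHandler and anomalies.log are hypothetical names) is to keep recent DEBUG records in a fixed-size in-memory ring and write them to disk only when an ERROR-level record arrives, so pages that parse cleanly cost almost nothing:

    import collections
    import logging

    class RingBufferHandler(logging.Handler):
        """Keep the last `capacity` records in memory; write them all
        to `target` only when a record at `flush_level` or above arrives."""
        def __init__(self, target, capacity=500, flush_level=logging.ERROR):
            super().__init__()
            self.target = target
            self.flush_level = flush_level
            self.buffer = collections.deque(maxlen=capacity)  # oldest drop off

        def emit(self, record):
            self.buffer.append(record)
            if record.levelno >= self.flush_level:
                # Anomaly: dump the buffered context, then start fresh.
                for buffered in self.buffer:
                    self.target.handle(buffered)
                self.buffer.clear()

    log = logging.getLogger("spider.parse")
    log.setLevel(logging.DEBUG)
    log.addHandler(RingBufferHandler(logging.FileHandler("anomalies.log")))

    # Normal pages: DEBUG records stay in RAM and never hit disk.
    log.debug("fetched %s, %d bytes", "http://example.com/", 1024)
    # Broken HTML: one ERROR flushes the full buffered context to disk.
    log.error("unclosed <table> at byte %d", 512)

Python's stock logging.handlers.MemoryHandler is close to this, but it also flushes whenever the buffer fills; the deque here silently drops the oldest records instead, which better matches the "stay quiet unless something's wrong" goal.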