http://www.perlmonks.org?node_id=324335

crabbdean has asked for the wisdom of the Perl Monks concerning the following question:

I've written a program to traverse our file server and remove temp files and unwanted old directories from users' directories. In development and testing it worked fine, but on its first run on the file server it died by the time it got to users beginning with the letter D.
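To give a rough idea of what the cleanup does, here's a simplified sketch (not the real code; the *.tmp/*.bak extensions and the 30-day cutoff are just examples of the sort of rules it applies, and the real thing also prunes old directories):

#!perl
use strict;
use warnings;
use File::Find;

# Example policy only: delete *.tmp and *.bak files untouched for more than 30 days.
my $root         = "\\\\nwcluster_vol1_server\\vol1\\Users";
my $max_age_days = 30;

find(sub {
    return unless -f;                      # plain files only
    return unless /\.(tmp|bak)$/i;         # temp-style extensions only
    return unless -M _ > $max_age_days;    # skip anything modified recently
    unlink $_ or warn "Could not delete $File::Find::name: $!\n";
}, $root);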

I rewrote a few things, put in a few evals to make sure certain tests completed, and re-ran it. Again it died at ABOUT the same point.

On monitoring it I noticed the script was munching memory. I watched in horror as my system slowly ground itself to a halt. The File::Find module is the heart of the program, and the program does call it recursively. I've since tested, and it's not the recursion that's the problem. As a test, I wrote a simple script (see below) that just loops, running File::Find over my own directory. The script below doesn't munch memory as fast, but you can still see it disappearing (my real program is a little more intensive and chews through it faster). Obviously, with the size of our file server, this solution isn't viable as it stands.

My only solution at this stage seems to be to either find the leak in the File::Find module or work around it (one possible workaround is sketched below, after the test script). Anyone else come across this problem?

Anyone aware of a garbage collection module?

Thanks
Dean

#!perl
use File::Find;

# Test: repeatedly walk the same directory tree, printing each file name,
# and watch the process's memory usage grow.
while (1) {
    print "\n\nStarting again ....\n\n";
    sleep 2;
    find(\&processfiles, "\\\\nwcluster_vol1_server\\vol1\\Users\\CrabbD");
}

sub processfiles {
    print "$File::Find::name\n";
}
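One possible workaround (rather than a fix for the leak itself) would be to keep each File::Find run in its own short-lived process, so whatever memory it chews up goes back to the OS when that process exits. A minimal sketch, assuming a hypothetical clean_user_dir.pl helper that does the actual find()/delete work for a single user's directory:

#!perl
use strict;
use warnings;

my $root = "\\\\nwcluster_vol1_server\\vol1\\Users";

# Collect the top-level user directories.
opendir my $dh, $root or die "Cannot open $root: $!\n";
my @users = grep { !/^\.\.?$/ && -d "$root\\$_" } readdir $dh;
closedir $dh;

# Run the real cleanup (clean_user_dir.pl is a hypothetical helper)
# as a fresh perl process per user, so any leaked memory is freed each time.
for my $user (sort @users) {
    print "Processing $user ...\n";
    system($^X, 'clean_user_dir.pl', "$root\\$user") == 0
        or warn "Cleanup failed for $user: $?\n";
}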