From an overnight test run, my process consumes a lot of memory. Could this be File::Find? I suspect it's more to do with the image processing and creation of image objects which are not being released.
I've downloaded and read the source of the current File::Find, and except for an explicitly coded stack that I expected to be implicit, nothing unusual happens. I expect most of the memory used inside File::Find to be the stack for descending the directory tree, plus one per-directory array of directory contents (i.e. the readdir results) per level. So unless you have a very deeply nested directory tree, or directories filled with millions of files, it is very unlikely that File::Find is the root of your memory problem.
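As a baseline for such a memory experiment, a wanted function that does nothing but count files should keep File::Find's memory use essentially flat (a minimal sketch; the start directory '.' is a placeholder for your tree):

```perl
use strict;
use warnings;
use File::Find;

# Baseline: traverse the tree without any image work. If even this
# consumes large amounts of memory, the tree itself (extreme depth or
# huge directories) is the problem, not your processing code.
my $count = 0;
find( sub { $count++ if -f }, '.' );   # replace '.' with your tree
print "plain files seen: $count\n";
```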
Disable the image processing in your code (insert a return as the first line of the wanted function if you have no better idea) and run it again. Watch the memory use. If it still consumes large amounts of memory, you have likely found a problem in File::Find. If not, look at your image processing code. Try to explicitly destroy the Image::Magick objects you created, e.g. $imageObject = ''; or undef $imageObject; so that the last reference to the object is dropped.
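The two steps above might look like this (a sketch, not your actual code: the file-extension filter, the start directory, and the processing placeholder are all assumptions, and Image::Magick is loaded only if available):

```perl
use strict;
use warnings;
use File::Find;

# Load Image::Magick only if it is installed, so the sketch still runs
# without it.
my $have_magick = eval { require Image::Magick; 1 };

sub wanted {
    # Step 1 of the experiment: uncomment the next line to disable all
    # image work, then re-run and watch the memory use.
    # return;

    return unless -f && /\.(jpe?g|png|gif)$/i;   # assumed image filter
    return unless $have_magick;

    my $image = Image::Magick->new;
    $image->Read($File::Find::name);
    # ... your processing here ...

    # Step 2: explicitly drop the reference so Perl can release the
    # object right away instead of at some later, unclear point.
    undef $image;
}

find( \&wanted, '.' );   # replace '.' with your image directory
```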
Perhaps Image::Magick leaks some memory. I don't know; search for yourself. If it leaks too much memory, you could move the actual image processing into a separate process, so that all leaked memory is released when that process ends. Something like this inside your wanted function should do the trick (untested code):
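A sketch of that trick, with process_image() as a hypothetical stand-in for your actual Image::Magick code:

```perl
use strict;
use warnings;

# Do the possibly-leaky work in a child process. When the child exits,
# the operating system reclaims everything it allocated, leaks included.
sub process_in_child {
    my ($path) = @_;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: all memory allocated (or leaked) here dies with us.
        process_image($path);
        exit 0;
    }
    waitpid( $pid, 0 );    # parent: reap the child before moving on
    return $? >> 8;        # child's exit status
}

sub process_image {
    my ($path) = @_;
    # ... your real Image::Magick work on $path goes here ...
}

# Inside your wanted function you would then call something like:
# process_in_child($File::Find::name) if -f && /\.(jpe?g|png)$/i;
```

Waiting for each child before continuing keeps only one extra process alive at a time, at the cost of one fork per image.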
Note that forking a sub-process has its own costs. Also note that fork and waitpid are platform-dependent. They are not natively available on Windows; Perl uses an emulation based on threads there. While the Perl port makes your script think that it forked a new process, it actually just created a new thread, and leaked memory will not be freed until your script ends. So this trick will most likely not work on Windows.
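If the script also has to run on Windows, it may be worth guarding the trick with a platform check ($^O is 'MSWin32' on native Windows Perl):

```perl
use strict;
use warnings;

# On Windows, fork() is emulated with interpreter threads, so the
# "child" shares the process and leaked memory is only returned when
# the whole script exits. Skip the fork trick there.
my $fork_is_real = $^O ne 'MSWin32';
print $fork_is_real
    ? "fork creates a real process here\n"
    : "fork is thread-emulated here, trick won't help\n";
```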
Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
In reply to Re^5: Recursive image processing (with ImageMagic)