
Re: File::Find memory leak

by samtregar (Abbot)
on Jan 27, 2004 at 04:08 UTC (#324346=note)

in reply to File::Find memory leak

There's no such thing as a "garbage collection" module. Perl does its own garbage collection using reference counting, and if something's getting lost there's not much you can do about it aside from fixing the leaky code.

If you can't find and fix the leak, you'll probably have to fork() a sub-process to do whatever leaks, pass the results up to the parent via a pipe or temp file, and then exit() the child. When the child exits, any memory it used will be reclaimed by the operating system. I've used this technique before with leaky Perl modules. Give it a try and post again if you have trouble.
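The fork-and-pipe approach described above could be sketched roughly like this; `find_in_child` is a hypothetical helper name, not anything from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Sketch of the fork-and-pipe technique: run the (possibly leaky)
# File::Find traversal in a child process, stream results back through
# a pipe, and let the OS reclaim the child's memory when it exits.
sub find_in_child {
    my ($dir) = @_;
    pipe(my $reader, my $writer) or die "pipe failed: $!";

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {                # child: do the traversal
        close $reader;
        find(sub { print {$writer} "$File::Find::name\n" }, $dir);
        close $writer;              # flush and signal EOF to the parent
        exit(0);                    # OS reclaims all child memory here
    }

    close $writer;                  # parent: read results from the pipe
    my @files = <$reader>;
    chomp @files;
    close $reader;
    waitpid($pid, 0);               # reap the child
    return @files;
}

my @found = find_in_child('.');
print scalar(@found), " entries found\n";
```

Each call pays the cost of a fork, so for many small directories you would batch the work rather than fork per file.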


PS: The above suggestion assumes you're working on a Unix system. I imagine things are different in Windows-land, where fork() is emulated with threads and exit() probably doesn't free memory.

Re: Re: File::Find memory leak
by crabbdean (Pilgrim) on Jan 27, 2004 at 04:21 UTC
    Thanks Sam, that was exactly my thinking. Great minds! If the fork doesn't work, a simpler possible alternative is to write a main script that does all the logging and have it call a second script, containing the File::Find code, each time it traverses a user's directory. That way the memory File::Find uses is freed each time the second script exits. I'll let you know the results.

    The "perltodo" manual page says some garbage-collection work is still to be done in future versions of Perl.
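The separate-script alternative described above could be sketched as follows; this is a hypothetical illustration, with the "second script" inlined as a perl -e one-liner so the example is self-contained (a real setup would invoke an actual worker script per directory):

```perl
use strict;
use warnings;

# Sketch of the two-script approach: the main (logging) script runs the
# File::Find work in a fresh perl process for each directory, so that
# process's memory is returned to the OS when it exits.
sub count_files_in_subprocess {
    my ($dir) = @_;
    # $^X is the path to the running perl; the one-liner stands in for
    # the hypothetical second script containing the File::Find code.
    my $out = `$^X -MFile::Find -e 'my \$n = 0; find(sub { \$n++ }, shift); print \$n' "$dir"`;
    die "subprocess failed for $dir: $?" if $? != 0;
    return $out;
}

print count_files_in_subprocess('.'), " entries under .\n";
```

Spawning a process per directory is slower than a single traversal, but it caps the memory growth of any one run.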

