PerlMonks
Re: File::Find memory leak

by BrowserUk (Pope)
on Jan 27, 2004 at 06:11 UTC ( #324360 )


in reply to File::Find memory leak

Using 5.8.2 (AS808) on XP, and processing a little over 200_000 files, I see a growth pattern of around 22k per iteration, or maybe 10 bytes per file.
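A minimal way to reproduce that kind of measurement (a sketch only; the tiny throw-away tree below is a stand-in for the ~200_000-file tree described above, and you'd watch the process size in Task Manager or via your platform's tools between passes):

```perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);
use File::Path qw(make_path);

# Build a small throw-away tree (stand-in for the real ~200_000-file tree).
my $dir = tempdir( CLEANUP => 1 );
make_path("$dir/sub");
for my $i (1 .. 5) {
    open my $fh, '>', "$dir/sub/file$i" or die "open: $!";
    close $fh;
}

# Repeat the search; in the measurements above, each pass grew
# the heap by roughly 22k.
our $count;
for my $pass (1 .. 3) {
    $count = 0;
    find( sub { ++$count if -f }, $dir );
    print "pass $pass: $count files\n";
}
```

Nothing in the loop holds a reference between passes, which is what makes the steady per-iteration growth interesting.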

If I fork each iteration of the search, the growth increases slightly, to around 31k per iteration over the same 205,428 files.
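A sketch of the forked variant (the search root here is a placeholder): the idea is that each pass runs in a child, so the child's fragmented heap is discarded when it exits and only the parent's footprint persists:

```perl
use strict;
use warnings;
use File::Find;

my $dir = '.';    # placeholder for the real search root

for my $pass (1 .. 3) {
    defined( my $pid = fork() ) or die "fork failed: $!";
    if ($pid == 0) {
        # Child: do one full search; its heap dies with it on exit.
        my $count = 0;
        find( sub { ++$count if -f }, $dir );
        exit 0;
    }
    waitpid( $pid, 0 );
    die "child failed" if $?;
}
print "all passes done\n";
```

Worth noting: on Win32, fork is emulated with interpreter threads inside the one process, which may be why forking changed the growth rather than eliminating it.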

Doing a crude comparison of heap dumps taken before and after an iteration, it appears that the leakage isn't due to something not being freed, but rather to fragmentation of the heap: larger entities are freed, their space is partially re-used for smaller allocations, and the heap then has to grow the next time one of the larger entities needs to be allocated.

Note: The comparison was very crude...with something like 12000 individual blocks on the heap, it had to be :)

Having the script exec itself after each iteration does stop the growth, but whether that is practical will depend upon the nature and design of your program.
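A sketch of the exec-yourself approach (carrying the iteration counter through @ARGV is my own invention for illustration, not anything from the script above): because exec replaces the entire process image, the fragmented heap goes with it, so any state that must survive has to be passed explicitly via arguments, the environment, or a file:

```perl
use strict;
use warnings;

# Hypothetical scheme: carry the iteration number in @ARGV so the
# restarted process knows where to resume.
my $iter = @ARGV ? $ARGV[0] : 1;
my $max_iters = 3;

# ... one File::Find pass would go here ...
print "iteration $iter\n";

if ($iter < $max_iters) {
    # exec never returns on success: the process image (and its
    # fragmented heap) is replaced by a fresh copy of this script.
    exec( $^X, $0, $iter + 1 ) or die "exec failed: $!";
}
```

This is also why it can be impractical: open filehandles, in-memory caches, and accumulated results all vanish at each restart unless you serialise them somewhere first.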


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
Timing (and a little luck) are everything!

