PerlMonks  

Re: Long running tasks, perl and garbage collection

by grinder (Bishop)
on May 21, 2009 at 14:59 UTC (#765472=note)


in reply to Long running tasks, perl and garbage collection

In any programming language, memory leaks are the bane of long-running processes. You might be using a library that leaks, or a library that is sane on your platform but leaks on another.
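One common way Perl code leaks, even without XS involved, is a reference cycle: Perl's reference-counting collector never frees structures that point at each other. A minimal sketch (the hash names here are purely illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Scalar::Util 'weaken';

# A classic Perl leak: two structures referencing each other.
# Their reference counts never drop to zero, so neither is freed.
my $parent = {};
my $child  = { parent => $parent };
$parent->{child} = $child;          # cycle created

# Breaking the cycle with a weak reference lets both be reclaimed
# once the last external reference goes away.
weaken($child->{parent});
```

Bury a cycle like this inside a CPAN module and a long-running process accumulates it on every call, which is exactly the situation the rest of this node is about.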

A master-class programmer takes this into account and arranges to push as much processing as possible into transient processes. Apache and Postfix are two exemplars of this design.

mod_perl also behaves this way. Looking at one of my sites, I see that the mod_perl controller was started on the 1st of May and is occupying about 50 MB. The oldest worker process is about five hours old and clocks in at about 105 MB. The youngest are about 30 minutes old and haven't risen much beyond the initial 50 MB. By tonight they will all have been reaped and the RAM recycled. There may be some crappy CPAN code in there that leaks like a sieve, but as far as I am concerned it is below the radar, thanks to mod_perl's controller/worker architecture.
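The controller/worker idea is easy to sketch outside mod_perl: fork workers, let each one handle a bounded amount of work, then have it exit so the OS reclaims its memory wholesale, leaks and all. This is a hypothetical toy (the `handle_job` routine and the pool size are made up), not mod_perl's actual code:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical job handler -- anything it leaks dies with the worker.
sub handle_job { my ($n) = @_; return $n * 2 }

my $MAX_JOBS_PER_WORKER = 100;   # recycle workers before leaks add up

my %kids;
for (1 .. 2) {                   # a pool of two transient workers
    defined(my $pid = fork) or die "fork: $!";
    if ($pid == 0) {
        # Child: do a bounded amount of work, then exit so the OS
        # reclaims every byte, leaked or not.
        handle_job($_) for 1 .. $MAX_JOBS_PER_WORKER;
        exit 0;
    }
    $kids{$pid} = 1;
}

# Parent: reap finished workers. A real server would fork fresh
# replacements here to keep the pool full, as mod_perl and Postfix do.
while (%kids) {
    my $pid = waitpid(-1, 0);
    delete $kids{$pid} if exists $kids{$pid};
}
```

The parent stays small and long-lived; the leaky work happens only in processes with a short, bounded lifetime.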

This approach is simple to put into practice and surprisingly robust.

• another intruder with the mooring in the heart of the Perl

