PerlMonks  

Re: Perl and memory usage. Can it be released?

by kennethk (Monsignor)
on Feb 07, 2014 at 16:58 UTC


in reply to Perl and memory usage. Can it be released?

As I understand it, the perl process holds onto any heap memory it has been allocated (I could be wrong), so yes, in your case it will always have the memory footprint of the large use case. There are a couple of approaches that might help ameliorate this for you:

  1. Can you modify your file parsing so it's streaming instead of slurping? Just because you need to process 90 MB doesn't necessarily mean you need to hold 90 MB of data in memory at once.

  2. Can you combine the above with a database? For example, by using an SQLite database, you should be able to avoid a large memory footprint for perl while still maintaining access to the data. You could swap that to an in-memory database if file access times become prohibitive, but I'm unclear as to whether that would create a permanent memory footprint.

  3. Finally, you could have a parent process that forks, and the children parse your files. That way, when the child is reaped, the memory is recovered.
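The streaming approach in item 1 might look something like this. It's a minimal sketch, assuming a word-tallying task for illustration; the sub name process_file is made up. Memory stays bounded by the longest line rather than the whole file:

```perl
use strict;
use warnings;

# Tally word counts from a large file without slurping it.
# Reading with while(<$fh>) pulls one line at a time.
sub process_file {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    my %count;
    while (my $line = <$fh>) {
        chomp $line;
        $count{$_}++ for split ' ', $line;   # split on whitespace
    }
    close $fh;
    return \%count;
}
```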
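For item 2, a sketch using DBI with DBD::SQLite, assuming it is installed. The filename records.db and the table layout are made up; swapping dbname=records.db for dbname=:memory: gives you the in-memory variant, at the cost of the footprint living in RAM again:

```perl
use strict;
use warnings;
use DBI;

# Stream parsed records into an on-disk SQLite database instead of
# keeping them all in a Perl hash.
my $dbh = DBI->connect('dbi:SQLite:dbname=records.db', '', '',
                       { RaiseError => 1, AutoCommit => 0 });
$dbh->do('CREATE TABLE IF NOT EXISTS records (key TEXT, value TEXT)');

my $sth = $dbh->prepare('INSERT INTO records (key, value) VALUES (?, ?)');
# Stand-in for your real parse loop; insert a couple of dummy rows:
$sth->execute($_, "value-$_") for qw(alpha beta);
$dbh->commit;

my ($n) = $dbh->selectrow_array('SELECT COUNT(*) FROM records');
print "$n rows stored\n";
$dbh->disconnect;
```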
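And for item 3, a sketch of the fork-per-file idea. Here parse_file is a stand-in for your real parser (it just counts lines and writes the count to a sidecar file so the parent can see a result); all memory the child allocates is returned to the OS when the child exits and is reaped:

```perl
use strict;
use warnings;

# Stand-in parser: count lines in $path, write the count to "$path.count".
sub parse_file {
    my ($path) = @_;
    open my $in, '<', $path or die "Can't open $path: $!";
    my $lines = 0;
    $lines++ while <$in>;
    close $in;
    open my $out, '>', "$path.count" or die "Can't write: $!";
    print $out "$lines\n";
    close $out;
}

sub process_files {
    for my $path (@_) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {        # child: do the memory-hungry work here
            parse_file($path);
            exit 0;             # child's memory is released on exit
        }
        waitpid $pid, 0;        # parent reaps the child
    }
}
```

If the parent needs the parsed data back, a pipe or temporary file works; exit status alone only carries a small integer.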

See also http://stackoverflow.com/questions/9733146/tips-for-keeping-perl-memory-usage-low.


#11929 First ask yourself `How would I do this without a computer?' Then have the computer do it the same way.

