http://www.perlmonks.org?node_id=1073902


in reply to Perl and memory usage. Can it be released?

As I understand it, the perl process holds onto any heap memory it has been allocated (I could be wrong), so yes, in your case it's always going to have the memory footprint of its largest use case. There are a couple of approaches that might help ameliorate this for you:

  1. Can you modify your file parsing so that it streams the file instead of slurping it? Just because you need to process 90 MB doesn't necessarily mean you need to hold 90 MB of data in memory at once. (See the first sketch after this list.)

  2. Can you combine the above with a database? For example, with an SQLite database you should be able to avoid a large memory footprint for perl while still having access to all the data. You could swap that for an in-memory database if file access times become prohibitive, but I'm unclear on whether that would create a permanent memory footprint. (See the second sketch below.)

  3. Finally, you could have a parent process that forks and lets the children parse your files. That way, when a child exits and is reaped, its memory is returned to the operating system. (See the third sketch below.)
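
For point 1, here's a minimal streaming sketch. The file name and the tab-separated format are just placeholders for whatever your real data looks like:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Instead of slurping everything at once:
    #   my @lines = <$fh>;   # holds the whole 90 MB in memory
    # read and process one line at a time:
    open my $fh, '<', 'big_input.txt' or die "open: $!";
    my %counts;
    while (my $line = <$fh>) {
        chomp $line;
        my ($key) = split /\t/, $line;   # keep only the small summary you need
        $counts{$key}++;
    }
    close $fh;

The peak footprint is then roughly the size of %counts, not the size of the file.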
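
For point 2, a sketch using DBI with DBD::SQLite; the table layout and file names are assumptions for illustration:

    use strict;
    use warnings;
    use DBI;

    # An on-disk SQLite database keeps the data out of perl's heap;
    # change dbname to ':memory:' to try the in-memory variant.
    my $dbh = DBI->connect('dbi:SQLite:dbname=records.db', '', '',
                           { RaiseError => 1, AutoCommit => 0 });
    $dbh->do('CREATE TABLE IF NOT EXISTS records (key TEXT, value TEXT)');

    my $sth = $dbh->prepare('INSERT INTO records (key, value) VALUES (?, ?)');
    open my $fh, '<', 'big_input.txt' or die "open: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($key, $value) = split /\t/, $line, 2;
        $sth->execute($key, $value);
    }
    close $fh;
    $dbh->commit;

    # Later lookups pull rows back on demand instead of keeping
    # the whole data set in a perl hash:
    my ($value) = $dbh->selectrow_array(
        'SELECT value FROM records WHERE key = ?', undef, 'some_key');

Wrapping the inserts in a single transaction (AutoCommit => 0, then commit) matters a lot for SQLite insert speed.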
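
And for point 3, a bare-bones fork/reap skeleton; parse_file() is a hypothetical stand-in for your real parsing code, and the children here run one at a time for simplicity:

    use strict;
    use warnings;

    for my $file (@ARGV) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;

        if ($pid == 0) {
            parse_file($file);   # child: memory used here dies with the child
            exit 0;
        }
        waitpid $pid, 0;         # parent: reap before starting the next child
    }

    sub parse_file {
        my ($file) = @_;
        open my $fh, '<', $file or die "open $file: $!";
        while (my $line = <$fh>) {
            # ... heavy per-line processing here ...
        }
        close $fh;
    }

If the parent needs results back, have the child write a small summary through a pipe or a temporary file rather than returning a big data structure.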

See also http://stackoverflow.com/questions/9733146/tips-for-keeping-perl-memory-usage-low.


#11929 First ask yourself "How would I do this without a computer?" Then have the computer do it the same way.
