sherab has asked for the wisdom of the Perl Monks concerning the following question:
I have a perl script that spends its days reading in files and processing them.
We get files that range from tiny to, in some cases, 90MB. I know that memory allocation is somewhat elastic, but my question is about what happens when my script has been reading 90KB files all morning, reads in a 90MB monster at around 11am, and then goes back to reading 90KB files for the rest of the day. Is all the memory it took for the 90MB file still being consumed for the rest of the day? I assume it is, since Perl only releases that memory back to the OS after the process exits, and this process runs all day.
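For context, the read loop is essentially the shape sketched below; process_file() and the slurp are stand-ins, not our actual code:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Roughly what the script does all day: slurp each file whole,
    # then hand it off for processing.
    for my $file (@ARGV) {
        open my $fh, '<', $file or die "Can't open $file: $!";
        my $data = do { local $/; <$fh> };   # slurp whole file into memory
        close $fh;
        process_file($data);                 # peak usage tracks the largest file
    }

    sub process_file {
        my ($data) = @_;
        return length $data;                 # placeholder for the real processing
    }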
I also see that my perl is compiled with "usemymalloc=n" (so it uses the system's malloc rather than Perl's own).
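For reference, that's from running this on the box:

    $ perl -V:usemymalloc
    usemymalloc='n';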
I would welcome any insights on this, or any input if you've experienced it before. It would be great if that memory could somehow be re-released back into the wild.
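One idea I've been toying with, sketched below with a made-up size threshold and the same placeholder helpers as above, is to fork a child to handle the monsters, so the OS gets the memory back when the child exits:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $BIG = 10 * 1024 * 1024;              # arbitrary 10MB cutoff for illustration

    for my $file (@ARGV) {
        if (-s $file > $BIG) {
            defined(my $pid = fork()) or die "fork failed: $!";
            if ($pid == 0) {                 # child: do the big job, then exit
                process_file(slurp($file));
                exit 0;
            }
            waitpid($pid, 0);                # parent: the memory spike dies with the child
        }
        else {
            process_file(slurp($file));      # small files stay in-process
        }
    }

    sub slurp {
        my ($file) = @_;
        open my $fh, '<', $file or die "Can't open $file: $!";
        local $/;
        return scalar <$fh>;
    }

    sub process_file {
        my ($data) = @_;
        return length $data;                 # placeholder for the real processing
    }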
UPDATED: THANK YOU so much, monks! A lot of great answers!