A different kind of Spreadsheet::ParseExcel memory problem
by jcoxen (Deacon) on Sep 09, 2005 at 17:10 UTC
jcoxen has asked for the wisdom of the Perl Monks concerning the following question:
I'm writing a script that parses certain information out of a series of Excel files. The logic to read the files and grab the data I need works fine. My problem is that I keep running out of memory: what appears to be a memory leak that shows up as more files are opened. I'm processing the files sequentially, so I would expect any memory allocated when I open a file to be freed when I open the next, but that's not the case. I'm losing 1-2 KB of memory with each new file. This hasn't been a problem with other ParseExcel scripts I've written, since I was only working on 1 or 2 files at a time. Now I'm dealing with over 800 files, and it definitely is a problem.
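For context, a minimal sketch of the sequential-processing pattern described above. The file glob and extraction logic are placeholders, and it uses the method-based API of later Spreadsheet::ParseExcel releases (older releases expose the same data via hash fields such as $workbook->{Worksheet} and $sheet->{Cells}). The key point is in the final comment: Perl reclaims memory by reference counting, so if the workbook structure contains circular references, dropping your own reference is not enough and a little memory leaks on every file.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Spreadsheet::ParseExcel;

# Hypothetical file list for illustration.
my @files = glob('reports/*.xls');

foreach my $file (@files) {
    # Keep the parser and workbook in the tightest possible scope so
    # they are candidates for reclamation before the next file opens.
    my $parser   = Spreadsheet::ParseExcel->new();
    my $workbook = $parser->Parse($file);

    unless (defined $workbook) {
        warn "Could not parse $file\n";
        next;
    }

    for my $sheet ($workbook->worksheets()) {
        my ($row_min, $row_max) = $sheet->row_range();
        my ($col_min, $col_max) = $sheet->col_range();
        for my $row ($row_min .. $row_max) {
            for my $col ($col_min .. $col_max) {
                my $cell = $sheet->get_cell($row, $col) or next;
                # ... extract the needed values here ...
            }
        }
    }

    # Reference counting frees this only if nothing inside the workbook
    # refers back to itself; circular references in the parsed structure
    # would keep it alive and leak a little memory per file.
    undef $workbook;
}
```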
Various searches have turned up alternative cell-handling techniques to limit memory usage when parsing large files. I tried them just to make sure, but they had no effect. The files I'm working with aren't that large; there are just a lot of them.
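For readers who haven't seen it, the cell-handling technique referred to above is (as I understand the module's documentation) a callback passed to the constructor: with CellHandler and NotSetCell => 1, each cell is handed to your code and then discarded instead of being accumulated in the workbook. A sketch, with a placeholder file name and extraction logic:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Spreadsheet::ParseExcel;

# Callback invoked once per cell as the file is parsed.
sub cell_handler {
    my ($workbook, $sheet_index, $row, $col, $cell) = @_;
    my $value = $cell->value();
    # ... pull out the fields of interest here ...
}

my $parser = Spreadsheet::ParseExcel->new(
    CellHandler => \&cell_handler,
    NotSetCell  => 1,   # do not store cells in the workbook
);

my $workbook = $parser->Parse('some_report.xls');   # placeholder name
die "parse failed\n" unless defined $workbook;
```

This bounds memory per file, but as noted above it does not help if the leak is in the workbook structures themselves rather than in the stored cells.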
The pertinent sections of the code are listed below: