I am using Parse::MediaWikiDump to analyze a static XML dump of Wikipedia. I read every page and save some interesting information about it to a text file. (If it's relevant, I'm running Cygwin Perl 5.10 on Windows XP.)
The program's memory usage keeps growing quickly as I progress through the dump, even though I believe I am not accumulating any data in variables, only in files. Of course, I might be wrong: perhaps I am accumulating something without noticing, perhaps Perl's garbage collector isn't doing its job, or perhaps some internal variable in Parse::MediaWikiDump is accumulating data.
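For reference, a simplified sketch of the kind of loop in question (the dump path, output file name, and choice of fields are illustrative, not my exact code):

```perl
use strict;
use warnings;
use Parse::MediaWikiDump;

# Illustrative file names -- substitute the real dump and output paths.
my $pages = Parse::MediaWikiDump::Pages->new('enwiki-pages-articles.xml');

open my $out, '>', 'interesting.txt' or die "Cannot open output: $!";
while (defined(my $page = $pages->next)) {
    my $text = $page->text;    # Parse::MediaWikiDump returns a reference to the article text
    print {$out} join("\t", $page->id, $page->title, length $$text), "\n";
}
close $out or die "Cannot close output: $!";
```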
I could start sprinkling Devel::Size::size() calls around the code, but that would be rather tedious: if I understand correctly, it works per variable, so I would have to write such a line for every variable, and I have a lot of them, not to mention the variables inside the external module.
Is there any convenient tool that can produce a detailed breakdown of a Perl program's memory usage at runtime?
I also tried periodically checking the value returned by Devel::Leak::NoteSV(), and it does keep growing, but I don't really know what to do with that information.
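For what it's worth, my understanding is that Devel::Leak becomes more informative when NoteSV() is paired with CheckSV() around a suspect block: CheckSV() dumps (to stderr) the SVs allocated since the matching NoteSV() call. A sketch; the suspect code is a placeholder:

```perl
use strict;
use warnings;
use Devel::Leak;

my $handle;    # opaque handle filled in by NoteSV
my $before = Devel::Leak::NoteSV($handle);

# ... run one iteration of the suspect code here ...

# CheckSV dumps the SVs created since NoteSV and returns the new count.
my $after = Devel::Leak::CheckSV($handle);
print STDERR "SV count grew by ", $after - $before, "\n";
```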
Thanks in advance for any help.