PerlMonks
Hunting a memory eater
by McA (Priest)
on Feb 27, 2014 at 13:59 UTC ( [id://1076397] )
McA has asked for the wisdom of the Perl Monks concerning the following question:

Hi all,

a process checking DNS names unexpectedly ate 12GB of memory, forced the machine into swap, and set off almost every alarm in our system monitoring, which really surprised me. DBI needed about 1GB to fetch all results from the MySQL database into the process; that was expected for several million rows and fine. But afterwards, while checking each DNS name in a fetch loop, the process memory grew and grew in a way I really did not expect. I have now reduced the whole thing to the following snippet, which is more or less literally taken from the man page of Net::DNS. Combined with the very useful module Test::LeakTrace I have the following:
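The original snippet did not survive the page extraction. What follows is a minimal sketch of what such a test plausibly looks like, assuming a warm-up query so one-shot setup costs are excluded, a resolver object reused across iterations, and a hypothetical hostname (www.example.com) standing in for the real data:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::DNS;
use Test::LeakTrace;

# Iteration count from the command line (default 1)
my $count = shift // 1;

# Warm-up pass so one-time instantiation/caching costs
# are not reported as leaks by the traced block below
my $res = Net::DNS::Resolver->new;
$res->search('www.example.com', 'A');

# leaktrace prints every reference leaked inside the block
leaktrace {
    for (1 .. $count) {
        my $reply = $res->search('www.example.com', 'A');
        if ($reply) {
            # touch each A record's address, as a real check would
            $_->address for grep { $_->can('address') } $reply->answer;
        }
    }
};
```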
The result is that Test::LeakTrace reports MANY leaks. Following the documentation of Test::LeakTrace, I gave Net::DNS the benefit of the doubt: one-shot setup costs are fine, but iterating over code that has meanwhile been instantiated and cached should not need additional memory. To prove this, the snippet has a for-loop around the lookup whose iteration count can be given on the command line. Together with the following little shell script:
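That shell script was also lost in extraction; a plausible reconstruction (assuming the snippet above is saved as leak.pl, a hypothetical filename) simply runs it with growing iteration counts so the leak report can be compared across runs:

```sh
#!/bin/sh
# Run the leak test with increasing iteration counts
# and label each run so the outputs can be compared.
for n in 1 10 100 1000; do
    echo "=== $n iterations ==="
    perl leak.pl "$n"
done
```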
I get the following output:
You can see that the leaked memory rises with the iteration count, which looks very bad to me. Now to you monks: I would be interested to see the results on different platforms with different Perl versions. If you have the time and inclination to test it on your Perl, I would be really interested in the results. By the way: if someone knows a module for DNS lookups that does not eat my machine's memory, please tell me. :-) Does anybody know of documentation on how to read the verbose output of Test::LeakTrace (which is not used in this script)?

Thank you in advance
Regards