Hunting a memory eater

by McA (Priest)
on Feb 27, 2014 at 13:59 UTC

McA has asked for the wisdom of the Perl Monks concerning the following question:

Hi all,

a process checking DNS names unexpectedly ate 12GB of memory, forced the machine into swap, and rang almost every alarm bell of our system monitoring.

So I was really surprised. DBI needed 1GB to get all results from the MySQL database into the process; that was expected for some millions of rows and fine. But afterwards I wanted to check the DNS name for each row in a fetching loop, and the process memory grew and grew, which I really didn't expect.
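For context, the loop had roughly the following shape; a minimal sketch, where the connection details, table, and column names are hypothetical stand-ins, as the real code isn't shown here:

use strict;
use warnings;
use DBI;
use Net::DNS;

# Hypothetical connection and schema; only the loop shape matters here.
my $dbh = DBI->connect('dbi:mysql:database=hosts', 'user', 'secret',
                       { RaiseError => 1 });
my $sth = $dbh->prepare('SELECT hostname FROM hosts');
$sth->execute;

while (my ($hostname) = $sth->fetchrow_array) {
    my @mx = mx($hostname);   # one DNS check per row; memory grew here
}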

I have now reduced the whole thing to the following snippet, which is more or less literally taken from the man page of Net::DNS. Combined with the very useful module Test::LeakTrace I have the following:

#!/bin/env perl
use strict;
use warnings;
use 5.010;

use local::lib './lib';
use Net::DNS;
use Test::LeakTrace;

die("ERROR: Please provide Loop count > 0 as first argument")
    unless $ARGV[0];

leaktrace {
    for (1 .. $ARGV[0]) {
        my $res = Net::DNS::Resolver->new;
        my @mx  = mx($res, "example.com");
    }
};

The result is that Test::LeakTrace reports MANY leaks. Following the Test::LeakTrace documentation, I'm willing to give Net::DNS the benefit of the doubt: one-shot setup costs are OK, but iterating over code that has been instantiated/cached in the meantime should not need additional memory. To check this, the loop count can be given on the command line. Together with the following little shell script:

for iterations in 1 2 3 4 5 6 7 8 9 10; do
    perl script.pl $iterations 2>&1 | wc -l
done

I get the following output:

1374
1492
1612
1736
1859
1982
2106
2229
2353
2477

You can see that the leaked memory rises with the iteration count. That seems very bad to me.
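Instead of counting report lines with wc -l, Test::LeakTrace can also report the number of leaked SVs directly via its leaked_count function; a minimal sketch along the lines of the script above:

use strict;
use warnings;
use Net::DNS;
use Test::LeakTrace;

die("ERROR: Please provide Loop count > 0 as first argument")
    unless $ARGV[0];

# leaked_count returns the number of leaked SVs instead of
# printing a report, so no line counting is needed.
my $leaked = leaked_count {
    for (1 .. $ARGV[0]) {
        my $res = Net::DNS::Resolver->new;
        my @mx  = mx($res, "example.com");
    }
};
print "leaked SVs: $leaked\n";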

Now to you monks: I would be interested to see the results on different platforms with different Perl versions. If you have the time and inclination to test this with your Perl, I would be really interested in the results.

By the way: if someone knows a module for DNS lookups that doesn't eat my machine's memory, please tell me. :-) Also, does anybody know of documentation on how to read the verbose output of Test::LeakTrace (which is not used in this script)?
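Regarding that verbose output: leaktrace takes an optional mode argument, so a minimal sketch to produce it would be:

use strict;
use warnings;
use Net::DNS;
use Test::LeakTrace;

# -verbose prints each leaked SV together with a dump of its
# contents and the place where it was allocated.
leaktrace {
    my $res = Net::DNS::Resolver->new;
    my @mx  = mx($res, "example.com");
} -verbose;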

Thank you in advance

Regards
McA

Replies are listed 'Best First'.
Re: Hunting a memory eater
by tobyink (Canon) on Feb 27, 2014 at 14:20 UTC

    Not that this really answers your question, but are you using an up-to-date version of Net::DNS? The changelog notes that some bugs pertaining to memory leaks (e.g. RT#84601, RT#81942) have been fixed recently.
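    One quick way to check which version is actually loaded (as opposed to what you think is installed) is a one-liner like:

        perl -MNet::DNS -le 'print Net::DNS->VERSION'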

    use Moops; class Cow :rw { has name => (default => 'Ermintrude') }; say Cow->new->name

      Hi tobyink,

      that was a very valuable hint (sorry, can't do more than one ++). I really assumed that I was using the newest version. Sometimes I hate myself for my (unverified) assumptions.

      So, the fact is, I wasn't using the newest version. The output above is based on version 0.72 of Net::DNS.

      So, for the record, I installed the newest version, 0.74, and let the whole thing run again. This looks much better now, even if I can't explain where the fixed cost in terms of base leaks comes from. The output now is:

      1240
      1234
      1234
      1234
      1233
      1234
      1233
      1234
      1234
      1233

      That means there is still a kind of one-time usage penalty, but the consumption does not seem to rise with the number of lookups. I'll see what happens when I look up different domains.

      Once again: thank you for pointing out the obvious thing I hadn't done: check for the newest version and have a look at the bug history. Rest assured, I will make atonement for that, as is expected of a monk. ;-)

      Best regards
      McA

Re: Hunting a memory eater
by oiskuu (Hermit) on Feb 27, 2014 at 17:26 UTC

    Why do you create a separate resolver object for each query?

      Hi,

      I don't do it that way in my own code, but a CPAN module I'm using uses Net::DNS in more or less this way. While searching for the culprit I was able to reduce it to the snippet shown above.

      So you're absolutely right that one should instantiate a single resolver object and then use that resolver for 1..n lookups.
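      A minimal sketch of that pattern, reusing one resolver for all lookups (the domains here are just examples):

          use strict;
          use warnings;
          use Net::DNS;

          # One resolver object, created once and reused for every query.
          my $res = Net::DNS::Resolver->new;
          for my $domain (qw(example.com example.net example.org)) {
              my @mx = mx($res, $domain);
              printf "%s: %d MX records\n", $domain, scalar @mx;
          }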

      In terms of memory usage, I would expect all memory to be freed once the reference to the object goes out of scope.
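      That expectation can be checked with a weak reference; a minimal sketch using Scalar::Util::weaken:

          use strict;
          use warnings;
          use Net::DNS;
          use Scalar::Util qw(weaken);

          my $res   = Net::DNS::Resolver->new;
          my $probe = $res;     # second reference to the same object ...
          weaken($probe);       # ... made weak, so it doesn't keep it alive
          undef $res;           # drop the last strong reference
          print defined $probe ? "resolver still alive\n"
                               : "resolver freed as expected\n";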

      Regards
      McA
