Clearing memory

by sunadmn (Curate)
on Apr 13, 2004 at 16:19 UTC ( [id://344782] )

sunadmn has asked for the wisdom of the Perl Monks concerning the following question:

Greetings, fellow monks. I have a script that uses Net::DNS::ZoneFile and DBI to parse zone files and insert the records into my MySQL DB, but I have an issue that seems to be memory related. When the script is run it chugs along fine, then reaches a point where it locks up for what seems like forever (5 to 10 minutes, during which CPU usage is at 98% or better). I think this may be because it isn't clearing memory out, but I'm unsure. What I would like to do is make the script read 10 files, free its memory, then pick up from that stopping point, read the next 10 files, and so on until the end. I'm not sure how to do this, so I come to you for advice. Code follows:
    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;
    use FileHandle;
    use Net::DNS::ZoneFile;
    use DBI;

    my $dbh = DBI->connect(
        "DBI:mysql:database=dns;host=localhost",
        "root", "bla",
        { 'RaiseError' => 1 }
    );
    my $sth = $dbh->prepare(
        "INSERT INTO a (hname,zone,host) VALUES (?,?,?)"
    );

    my $base = "/chroot/named/master/net";
    find(\&wanted, $base);
    $dbh->disconnect();
    exit;

    sub wanted {
        return unless ( -f $File::Find::name
                        and $File::Find::name !~ m!/in-addr/! );
        my $root  = shift;
        my $zone  = new FileHandle "$File::Find::name", "r";
        my @name  = split('/', $File::Find::name);
        my $out   = join('.', reverse(@name[4 .. $#name]));
        my $rrset = Net::DNS::ZoneFile->readfh($zone, $root);
        for (@$rrset) {
            # print "$out\n";
            # print "$_->{type}\n";
            # print "$_->rdatastr\n" if $_->{type} =~ m/^A/;
            # print "$_->{name},", $_->rdatastr, "\n" if $_->{type} =~ m/^A/;
            if ($_->{type} =~ m/^A/) {
                print "Working on file:\t$out\n";
                $sth->execute( $_->{name}, $out, $_->rdatastr )
                    or die $dbh->errstr;
                print "Finished work on file:\t$out\n";
            }
        }
    }

Replies are listed 'Best First'.
Re: Clearing memory
by perrin (Chancellor) on Apr 13, 2004 at 20:05 UTC
    If you're concerned about memory, have you checked how much memory the process is using? How much is free on the box? If it goes into swap, that will certainly slow things down.
      From the OP, I can't tell whether the difference between CPU usage and memory usage is clear to the poster. If it is, I apologize for telling you something you already know.

      CPU usage and memory usage are separate things. This code:
      perl -e 'for(;;) { }'
      will use close to 100% CPU while using little memory (for a perl process, at least), because the process constantly wants a slice of processor time to execute something close to nothing. In the 'top' output, the memory use is in the SIZE and RES columns:
        PID USERNAME PRI NICE  SIZE   RES STATE  TIME   WCPU    CPU COMMAND
      38691 eXile     64    0 2160K 1444K RUN    1:30 92.84% 92.04% perl5.8.2
      While this code:
      perl -e 'for(;;) { sleep 100 }'
      will use almost no CPU cycles while using almost the same amount of memory, because this process only asks for processor time once every 100 seconds:
        PID USERNAME PRI NICE  SIZE   RES STATE   TIME  WCPU   CPU COMMAND
        ...
      38692 eXile     10    0 2160K 1480K nanslp  0:00  0.00%  0.00% perl5.8.2
      I think in Win32 the process manager (or the 'top' provided by cygwin) should be able to display this kind of information as well. No experience with other OSes.
Re: Clearing memory
by matija (Priest) on Apr 13, 2004 at 18:56 UTC
    If you insist on processing only ten files at a time, I think your best bet is to break the script apart into two pieces: one that processes the filenames it gets on the command line, and one that does the find, gets the list of filenames, and calls the other script for groups of ten files.
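    One way to sketch that two-script split (a rough illustration, not the poster's code: "load_zones.pl" is a hypothetical name for the worker script that would do the Net::DNS::ZoneFile parsing and the DBI inserts; the find filter mirrors the OP's):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Split a list into chunks of $n items each.
    sub batches {
        my ($n, @items) = @_;
        my @out;
        push @out, [ splice(@items, 0, $n) ] while @items;
        return @out;
    }

    # Driver: collect the zone files, then hand them to a separate
    # worker script ten at a time, so each worker process starts
    # and exits with a fresh memory footprint.
    if (@ARGV) {
        my ($base) = @ARGV;    # e.g. /chroot/named/master/net
        my @files;
        find(sub {
            push @files, $File::Find::name
                if -f $File::Find::name
                && $File::Find::name !~ m!/in-addr/!;
        }, $base);

        for my $batch (batches(10, @files)) {
            # "load_zones.pl" is hypothetical: the worker half of
            # the split, taking filenames on its command line.
            system('./load_zones.pl', @$batch) == 0
                or die "worker failed on batch starting with $batch->[0]: $?";
        }
    }

    Since each batch runs in a child process that exits when it's done, the memory it used goes back to the OS instead of staying allocated to one long-lived perl.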
Re: Clearing memory
by Wassercrats (Initiate) on Apr 14, 2004 at 01:49 UTC

    I've had similar problems. It seemed like the 5-10 minute wait couldn't be due to normal processing time. I never figured it out, but in my case memory consumption skyrocketed and virtual memory was automatically increased, so maybe the slowness of using the hard drive as memory was the issue (maybe that's what swapping is?).

    I found Process Explorer which might help debug these things, but I haven't tried it yet.

    I just learned that one way to clear memory is to put the code in a package and use reset. This preserves system variables.

Re: Clearing memory
by Anonymous Monk on Apr 14, 2004 at 11:40 UTC
    It's probably a bug in Net::DNS::ZoneFile, but it could be that you're somehow leaking filehandles, so it can't hurt to close $zone when you're done with it.
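    For example, closing the handle explicitly as soon as the parse is done, rather than waiting for it to fall out of scope (a minimal sketch: the tempfile here just stands in for a real zone file, and the commented line marks where the OP's readfh call would go):

    use strict;
    use warnings;
    use FileHandle;
    use File::Temp qw(tempfile);

    # Stand-in for a real zone file under /chroot/named/master/net.
    my ($tmp, $path) = tempfile();
    print $tmp "example.net. 3600 IN A 192.0.2.1\n";
    close $tmp;

    my $zone = FileHandle->new($path, "r")
        or die "can't open $path: $!";
    # ... this is where Net::DNS::ZoneFile->readfh($zone, $root) would run ...
    my $record = <$zone>;
    $zone->close;    # release the descriptor right away

    print defined fileno($zone) ? "still open\n" : "closed\n";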
Re: Clearing memory
by Anonymous Monk on Apr 14, 2004 at 11:43 UTC
    Why are you using FileHandle? Net::DNS::ZoneFile will open the file for you.

Node Type: perlquestion [id://344782]
Approved by coreolyn