PerlMonks  

Re: Tracking down memory leaks

by perrin (Chancellor)
on Apr 13, 2005 at 12:34 UTC


in reply to Tracking down memory leaks

Growing and leaking are not the same thing. A perl program can use more memory after running a while even if nothing is wrong. For example, if you load a 10MB file into a scalar, that scalar will hang onto that memory, even if it goes out of scope. You would have to explicitly undef it to get the memory back.
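The scalar-holding-memory point above can be sketched in a few lines. This is a minimal illustration, not from the original post; the 10MB string stands in for a slurped file:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a large scalar, as if a 10MB file had been slurped into it.
my $data = 'x' x (10 * 1024 * 1024);
print length($data), "\n";    # 10485760

# Emptying the string does NOT release the buffer; perl keeps it
# reserved in case the scalar grows again.
$data = '';

# Only an explicit undef lets perl release the buffer.
undef $data;
print defined($data) ? "defined" : "undef", "\n";    # undef
```

Note that even after undef, whether the memory goes back to the operating system (rather than just to perl's own allocator) depends on the platform's malloc.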

So, your real question is "How can I make my program use less memory?" There are some answers to that in general terms in the Perl documentation and other places. For specific advice, try to narrow down a small section that grows a lot over time, and post it here for help.
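One way to narrow things down is to bracket a suspect section with memory measurements. Here is a quick sketch (my own, Linux-specific, reading /proc the same way the examples further down this thread do) that reports how much a block of code grows the process:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Linux-specific: read this process's virtual memory size from /proc.
sub vm_size_kb {
    open my $fh, '<', "/proc/$$/status" or die "cannot read /proc: $!";
    while (<$fh>) {
        return $1 if /^VmSize:\s+(\d+)\s+kB/;
    }
    return;
}

my $before = vm_size_kb();

# ... the suspect section goes here; a 10MB allocation as a stand-in:
my $big = 'x' x (10 * 1024 * 1024);

my $after = vm_size_kb();
printf "section grew the process by %d kB\n", $after - $before;
```

Run the same section in a loop: a section that grows once and then holds steady is merely big; one that grows on every pass is the leak candidate worth posting.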

Replies are listed 'Best First'.
Re^2: Tracking down memory leaks
by Hena (Friar) on Apr 13, 2005 at 12:47 UTC
    A perl program can use more memory after running a while even if nothing is wrong. For example, if you load a 10MB file into a scalar, that scalar will hang onto that memory, even if it goes out of scope.
    Shouldn't these situations be handled by garbage collection? If a scalar (or array/hash) goes out of scope (e.g. a subroutine's internal variables), shouldn't it be freed when memory is needed, before asking the system for more?

    AFAIK, memory requested from the system won't shrink back.
      Shouldn't these situations be handled by garbage collection?
      Maybe, but mostly they aren't, for performance reasons:
      sub bla {
          my $arg        = shift;
          my $big_string = $arg x 1000;
      }

      Perl will in general keep the memory for $big_string allocated and reserved, because then it doesn't need to allocate the memory again next time the sub is called.

      Explicitly undef()ing or resizing variables before they go out of scope sometimes helps, though - on some systems, it might even free the memory back to the system.

      Usually, you don't need to do this, exactly because the memory gets reused anyway. If your program grows a lot, it's more likely you're using an inefficient algorithm, or you're creating circular references somewhere.
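Circular references are the classic true leak under perl's reference counting: two structures pointing at each other never reach a refcount of zero. A minimal sketch (my own example, not from this thread) using Scalar::Util::weaken to break the cycle:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

my $probe;
{
    # parent and child reference each other: a cycle.
    my $parent = { name => 'parent' };
    my $child  = { name => 'child', parent => $parent };
    $parent->{child} = $child;

    # Weaken the back-reference so it no longer counts toward
    # $parent's refcount; the cycle can now be collected normally.
    weaken($child->{parent});

    # A weak reference from outside, to observe the collection.
    $probe = $child;
    weaken($probe);
}
# Without the first weaken(), $probe would still be defined here.
print defined $probe ? "cycle survived\n" : "cycle collected\n";
```

Comment out the `weaken($child->{parent})` line and the structure outlives the block, which is exactly the kind of growth that looks like a leak in a long-running program.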

        So basically, if one has a large variable that should be removed by scoping (thinking of memory rather than speed), one should undef it manually. But if the large variable is inside a sub and that sub is called again (e.g. from a loop), then all the variables within it are reused automatically.
        Note that in your example, Perl will keep the memory for $big_string twice:
        #!/usr/bin/perl

        use strict;
        use warnings;
        use Readonly;

        Readonly my $size => 10 * 1024 ** 2;  # 10 MB.

        sub show_mem {
            system "grep ^VmSize /proc/$$/status";
        }

        sub gimme_big {
            my $size = shift;
            my $var  = 'x' x $size;
        }

        show_mem;
        gimme_big $size;
        show_mem;

        __END__
        VmSize:     3464 kB
        VmSize:    23952 kB
        For a 10MB string, Perl allocates about 20MB of memory.

        undef-ing the variable makes Perl allocate about 10MB less:

        #!/usr/bin/perl

        use strict;
        use warnings;
        use Readonly;

        Readonly my $size => 10 * 1024 ** 2;  # 10 MB.

        sub show_mem {
            system "grep ^VmSize /proc/$$/status";
        }

        sub gimme_big {
            my $size = shift;
            my $var  = 'x' x $size;
            undef $var;
        }

        show_mem;
        gimme_big $size;
        show_mem;

        __END__
        VmSize:     3472 kB
        VmSize:    13716 kB
        So, how do we get rid of the extra 10MB? By a careful use of string eval:
        #!/usr/bin/perl

        use strict;
        use warnings;
        use Readonly;

        Readonly my $size => 10 * 1024 ** 2;  # 10 MB.

        sub show_mem {
            system "grep ^VmSize /proc/$$/status";
        }

        sub gimme_big {
            my $size = shift;
            my $var  = eval "'x' x $size";
            undef $var;
        }

        show_mem;
        gimme_big $size;
        show_mem;

        __END__
        VmSize:     3468 kB
        VmSize:     3468 kB
      That's what I was thinking too. The fact that memory usage slowly grows until the OS kills it is not good. The files I am processing have fairly short lines, and it processes the files line by line in a loop. All of the variables inside that loop should be "reusing" the same space, right? Otherwise, what is the point of scoping at all?

      Scott
      Project coordinator of the Generic Model Organism Database Project

        All of the variables in that loop will retain their memory between calls and reuse it. However, if you leave that loop, they will still retain that memory. Also, if you load one of them once with 10MB of data, it will stay at that size until perl exits unless you explicitly undef it when you're done with it.
      No, the memory is not freed. Perl keeps it as an optimization since you would need to allocate it again the next time this chunk of code runs. Of course that doesn't help you any if that code doesn't run again in your program... We have discussed this at length on the mod_perl list and it is covered in the mod_perl docs.
