http://www.perlmonks.org?node_id=242696

sureshr has asked for the wisdom of the Perl Monks concerning the following question:

I am using a hash reference to pass a hash between functions. Although I use delete, assign an empty list with =(), and undef, and try all of these in the function in which I receive the reference, the memory doesn't seem to get freed up. Is there some way of explicitly telling the interpreter to forcibly free what is really free?

Or does the hash reference need to be freed in a different way?

sample code snippet:

    sub f2 {
        my $ht_ref = shift;
        my %ht = %$ht_ref;   # note: this copies the whole hash, doubling the memory in use
        print "some val = ", $ht{1}, "\n";
        %ht = ();
        undef %ht;
    }

    sub f1 {
        my %ht = ();
        for (my $i = 0; $i < 0xfffff; $i++) {
            $ht{$i} = $i;
        }
        f2(\%ht);
        %ht = ();
        undef %ht;
    }

    f1();
    while (1) {
        # sleep here and monitor the memory. memory
        # doesn't seem to come down :((
        sleep(5);
    }
thanks,
-sureshr

Re: Memory leak when using hash 'references'
by Tanalis (Curate) on Mar 13, 2003 at 14:42 UTC
    If I remember this correctly, and assuming it's the system's report of free memory you're watching, memory that has been allocated to an application can't be released back to the system until the process itself terminates.

    The memory you're freeing will be reclaimed, but it will only be available to the Perl process that's running the script, until that process either performs an exec or exits normally.

    This isn't actually a memory leak in Perl - it's just how the system itself handles memory management.
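    One way to see this for yourself (a rough sketch; watch the process size in top or Task Manager while it runs): build and discard a big hash repeatedly. If Perl reuses its freed memory, the process size should peak on the first pass and then stay flat.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Build and throw away a large hash over and over.  The process
        # size should reach its peak on the first pass and then hold
        # steady, because Perl reuses the memory it freed internally.
        for my $pass (1 .. 10) {
            my %ht;
            $ht{$_} = $_ for 0 .. 0xfffff;
            print "pass $pass done\n";
            sleep 2;    # time to read the memory column
        }
        sleep 30;       # size stays at the peak: freed, but not returned to the OS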

    Hope that helps ..
    -- Foxcub
    A friend is someone who can see straight through you, yet still enjoy the view. (Anon)

    UPDATE: Clarifications.

      I would have assumed so, provided the memory stayed at a peak value. Instead, I am seeing a few KB being freed every now and then alongside a continual increase in memory (in MB), even though my application is no longer pulling in any data.
      My application pulls data into an internal queue and processes it later, which is why the memory grows by MB. But after every batch of processing I free all the memory I had used, which should eventually show up as 'free memory' in the interpreter's space. Instead, the memory increase is steady, where I would expect at least a steady state. What concerns me is that Perl does not seem to be reusing 'its free' memory before requesting more :(
      -sureshr
        I'm not sure if this is the case, but don't forget that Perl doesn't destroy anything until all references to it are either destroyed or undefed.

        To free up the memory associated with the data, and release it back to Perl, you need to make sure that you also set the original variables that store the data to undef, and ensure you have no other references to that item that could be keeping it alive.

        Just thinking about scoping: both of the hashes you undef in your subs are lexicals declared with my anyway, and hence will be destroyed automatically when the sub finishes executing.
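
        A tiny sketch of when destruction actually happens, whether through undef or through a variable going out of scope, using a DESTROY method, which fires at the exact moment the last reference goes away:

            #!/usr/bin/perl
            use strict;
            use warnings;

            # A class that announces when its memory is actually released.
            package Noisy;
            sub new     { return bless {}, shift }
            sub DESTROY { print "freed\n" }

            package main;

            my $obj   = Noisy->new;
            my $extra = $obj;    # a second reference to the same object

            undef $obj;          # refcount 2 -> 1: nothing is freed yet
            print "one ref gone\n";

            undef $extra;        # refcount 1 -> 0: "freed" prints here
            print "done\n";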

        I can't really say any more than that without seeing more code - I'd suggest first to check for references to the data that could be keeping it defined in memory.

        Hope that helps ..
        -- Foxcub
        A friend is someone who can see straight through you, yet still enjoy the view. (Anon)

      I am not a guru on OSes, but I didn't think that was the case; otherwise services like ftp, apache and mysql could consume a system within hours. I wrote a very simple C++ program:
      #include <unistd.h>

      int main(void)
      {
          int *a[10000];

          // allocate 10000 blocks of 1000 ints each (roughly 40 MB)
          for (long c = 0; c < 10000; c++)
              a[c] = new int[1000];

          // release every block again (array new pairs with delete[])
          for (long c = 0; c < 10000; c++)
              delete[] a[c];

          sleep(20);    // leave time to watch the process in top
          return 0;
      }
      the above program gives the memory back to the system, and top shows no abnormal memory usage. Removing the second loop results in approximately 40 MB of memory usage.

      I tested this on Redhat 7.3 (2.4.18-3) gcc version 2.96 20000731


        services like ftp, apache and mysql could consume a system within hours

        Doesn't each of those fork or re-exec itself for every request, to stop that from happening? The way I understand it, an incoming request is handled by the existing server process, while a second, new server process is forked off to await the next request. The first process is then free to exit once its request is complete, releasing any resources it was holding. Admittedly, my OS knowledge is fairly basic (stuff from Uni, and from my own personal interest), but I'm sure I've read that somewhere *grins*.
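
        Something like this minimal fork-per-request skeleton, perhaps (a sketch only; the port number and the one-line "request handling" are made up for illustration):

            #!/usr/bin/perl
            use strict;
            use warnings;
            use IO::Socket::INET;

            $SIG{CHLD} = 'IGNORE';    # let the kernel reap finished children

            my $server = IO::Socket::INET->new(
                LocalPort => 8080,    # arbitrary port for the example
                Listen    => 5,
                ReuseAddr => 1,
            ) or die "listen: $!";

            while (my $client = $server->accept) {
                my $pid = fork;
                die "fork: $!" unless defined $pid;
                if ($pid == 0) {               # child: serve one request ...
                    print {$client} "hello\n";
                    close $client;
                    exit 0;                    # ... then exit, returning all of
                }                              # its memory to the system
                close $client;                 # parent: back to accept()
            }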

        the above program gives back the memory to the system, and top shows no abnormal memory usage

        Gives memory back to the system how? Can you provide some specifics (numeric data as to what's going on, including the name of the field you're looking at in top?) A loop wouldn't (necessarily) use a large amount of memory in its own right; it'd be interesting to get a better idea of what's going on.

        -- Foxcub
        A friend is someone who can see straight through you, yet still enjoy the view. (Anon)

Re: Memory leak when using hash 'references'
by pg (Canon) on Mar 13, 2003 at 15:47 UTC

    Most likely this is not a memory leak. A memory leak means that you allocated some memory but lost hold of it, so there is no way for you to free it up before the process itself terminates. I would say what you see is actually a lazy garbage collector.
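
    For contrast, here is a sketch of what a genuine Perl-level leak looks like: a reference cycle, which plain reference counting can never collect (weaken from Scalar::Util is the usual fix).

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Each pass builds two hashes that point at each other.  Their
        # refcounts can never drop to zero, so every pass leaks the pair
        # until global destruction at process exit.
        for (1 .. 100_000) {
            my %a;
            my %b;
            $a{peer} = \%b;
            $b{peer} = \%a;    # the cycle; this is the leak
        }
        sleep 30;              # all 100_000 leaked pairs are still allocated here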

Re: Memory leak when using hash 'references'
by sureshr (Beadle) on Mar 13, 2003 at 14:40 UTC
    Additional info: I am using Perl 5.8 on Windows 2000, with Task Manager to view the memory usage (or is it just a Task Manager quirk, in the sense that although the process has freed the memory, the freed amount still doesn't show up?). But I guess that's not the case, as the memory still continues to grow, although I have freed the older hash references.
    -sureshr
Re: Memory leak when using hash 'references'
by nite_man (Deacon) on Mar 14, 2003 at 08:36 UTC
    As you know, Perl uses a garbage collector based on reference counting (as in Lisp). If the reference count of a variable drops to zero, the garbage collector destroys that variable and frees its memory.
    In your case, you have a hash and a reference to that hash. When you undef the hash, the hash reference still exists and points at that memory area.
    Maybe that is the cause of the memory leak.
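    A small sketch of that interplay (note that undef %ht releases the contents even while a reference keeps the hash container itself alive):

        #!/usr/bin/perl
        use strict;
        use warnings;

        my %ht  = map { $_ => $_ } 1 .. 10;
        my $ref = \%ht;    # the hash now has two "owners"

        undef %ht;         # the key/value pairs are released here, but the
                           # hash itself survives because $ref still points at it
        print scalar(keys %$ref), "\n";    # prints 0: same hash, now empty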
    I hope that I helped ... ---> SV* sv_bless(SV* sv, HV* stash);