http://www.perlmonks.org?node_id=727530

sduser81 has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I am trying to experiment with memory allocation in Perl and I hit a problem with Perl crashing out of memory.

I am using Win32 XP Pro.

Here is the code:

#!/usr/bin/perl
use FindBin qw($Bin);
use lib "$Bin/";
use strict;

my @idsMap;
my @h_item_selector;
my $cnt = 1;

# Phase 1: build a large array of hashes (grows to ~1.8GB).
for (my $i = 1; $i < 50; $i++) {
    my $start = "aaa";
    for (my $j = 1; $j < 250000; $j++) {
        my $val = "$cnt $i $j $start";
        $h_item_selector[$i]->{$val} = $cnt;
        $cnt++;
        $start++;
    }
}

# Phase 2: free it, then try to build a 10M-element array.
undef(@h_item_selector);
print "did undef of array\n";
sleep 10;

print "allocating string array\n";
my $g_freeSelectorIndex = 10000000;
for (my $i = 0; $i < $g_freeSelectorIndex; $i++) {
    if (($i % 10000) == 0) {
        print "Processing $i element\n";
    }
    $idsMap[$i] = -1;
    if (($i % 10000) == 0) {
        print "Processed $i element\n";
    }
}
I am creating a large array of hashes, which goes up to around 1.8GB of memory usage. I then undef the array, and try to create another array that will hold 10M integers. Perl crashes after allocating around 4.2M array elements in the new array.
I thought that after I did undef, Perl could reuse the memory that was freed.
However, it looks like after freeing memory, the program still causes Perl to run out of memory.

Please advise on what could be causing Perl not to reuse the freed memory.

I tried the same program on OS X, and it ran there without crashing.

This is just a test program that simulates behavior of a real script I have.

Thanks for your help.

Regards,


Tim

Replies are listed 'Best First'.
Re: out of memory problem after undef
by BrowserUk (Patriarch) on Dec 03, 2008 at 10:15 UTC

    The problem is that although the ~1.5 GB required by the AoHs is released back to the memory pool when you undef it, it isn't in a form that allows it to be completely re-usable by the subsequent big array.

    Perl's arrays require a single, contiguous chunk of memory for the base (AV) allocation--the (C-style) array of pointers to SVs--and for your large array that means a single chunk of 10e6 * 4 bytes, or ~40MB.

    Whilst that pales into insignificance relative to the memory previously used and freed from the AoHs, that memory was internally fragmented into much smaller chunks by the creation of the AoHs and is not reconstituted into contiguous lumps. So, when constructing the large array, the runtime has no choice but to go back to the OS for another lump of virtual memory, and that pushes you over the edge. What's worse is that it doesn't go back for one 40MB chunk, but rather has to go back for several large chunks as the array doubles and re-doubles in size as you construct it.
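
    If you want to watch that growth happening, here is a minimal sketch (mine, not from this thread) using the core Devel::Peek module: the MAX field in Dump()'s output is the highest index currently allocated for the pointer block, so you can see it jump as the array is extended, and see a pre-extended array get its full block in one step.

    #!/usr/bin/perl
    # Sketch only: observe the AV's allocated size (MAX) growing as
    # elements are pushed, versus a pre-extended array. Devel::Peek is
    # a core module and Dump() writes its report to STDERR.
    use strict;
    use warnings;
    use Devel::Peek qw(Dump);

    my @grow;
    for my $n (1 .. 5) {
        push @grow, 'x' x 10;
        print STDERR "--- after push #$n ---\n";
        Dump(\@grow);     # FILL = last used index, MAX = last allocated index
    }

    my @presized;
    $#presized = 9;       # pre-extend: one allocation covers all ten slots
    Dump(\@presized);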

    You might find that if you pre-size the array, after undefing the AoH:

    ...
    undef(@h_item_selector);
    print "did undef of array\n";
    sleep 10;

    print "allocating string array\n";
    my $g_freeSelectorIndex = 10000000;
    $#idsMap = $g_freeSelectorIndex;    ### pre-allocate to final size
    ...

    that you avoid the intermediate doubling allocations, and so avoid pushing things over the edge.

    Prior to needing the 40MB contiguous chunk, it has a 20MB chunk. And when that fills to capacity, it needs both concurrently in order to copy the old to the new. And the same was previously required at 10MB and 5MB etc. By preallocating to the final size in one step, all those previous intermediate stages can be avoided and that may allow your process to complete.
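
    To put rough numbers on that, here is a back-of-the-envelope sketch (mine; the simple double-on-full policy is only an assumption to illustrate the argument--real perls grow AVs by their own rules). It prints how big the old and new pointer blocks are at each step for a 10e6-element array with 4-byte pointers, and how much has to be live at once during the copy.

    #!/usr/bin/perl
    # Illustration only: the contiguous memory needed at each growth step
    # for the pointer block of a 10e6-element array, assuming 4-byte
    # pointers and doubling growth. Not a measurement of any real perl.
    use strict;
    use warnings;

    my $target   = 10_000_000;    # elements wanted
    my $ptr_size = 4;             # bytes per pointer on 32-bit Win32
    my $slots    = 4;             # a small initial allocation

    while ($slots < $target) {
        my $old = $slots * $ptr_size;
        $slots *= 2;
        my $new = $slots * $ptr_size;
        printf "%10.2f MB -> %10.2f MB  (both live during the copy: %10.2f MB)\n",
            $old / 2**20, $new / 2**20, ($old + $new) / 2**20;
    }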

    You might also avoid accumulating allocations that are large, but not large enough to be reused, by pre-allocating your hashes to their final size:

    for (my $i = 1; $i < 50; $i++) {
        my $start = "aaa";
        my %hash;
        keys %hash = 250_000;               ### pre-allocate the hash
        for (my $j = 1; $j < 250000; $j++) {
            my $val = "$cnt $i $j $start";
            $hash{$val} = $cnt;             ### Simple hash assignments
            $cnt++;
            $start++;
        }
        $h_item_selector[$i] = \%hash;      ### then assign a reference
    }

    In addition, avoiding compound dereferencing for the assignments might speed up the construction a little.
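
    If you want to check how much that last point buys you, here is a quick Benchmark sketch (mine, not a measurement from this thread, and scaled down from 250_000 keys) comparing assignment through $aoh[$i]{$key} with filling a plain hash and attaching the reference afterwards. Benchmark is a core module.

    #!/usr/bin/perl
    # Sketch: compare compound dereferencing against a plain hash that is
    # assigned to the array as a reference once it is fully built.
    use strict;
    use warnings;
    use Benchmark qw(cmpthese);

    my $keys = 50_000;    # scaled down from the OP's 250_000

    cmpthese(-3, {
        compound => sub {
            my @aoh;
            $aoh[0]{"key $_"} = $_ for 1 .. $keys;
        },
        plain_then_ref => sub {
            my @aoh;
            my %h;
            $h{"key $_"} = $_ for 1 .. $keys;
            $aoh[0] = \%h;
        },
    });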

    (BTW: What is it with all those variable names? A weird mixture of lower_case_with_underscores, CamelCase and a_MixtureOfTheTwo. Just horrible to work with.)


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
      Hi,

      Thanks for the reply.

      Regarding allocation of memory for the AoH, I currently don't see a way of making that more efficient, because in the context of the script where this is used, the structure is built up over time.

      Also, this seems like a fundamental problem with using such structures in Perl on Win32. By the way, I am using the ActiveState ActivePerl distribution, if that makes a difference.

      Based on what you said, because I have this huge AoH, memory is fragmented in small chunks and it can't be easily reused for a large array. Do you, or anyone else, know if there is a way to force Perl to defragment its memory pool to enable reallocation of the freed memory?

      I tried to preallocate the array, but there is not enough system memory, so Perl runs out of memory.

      Thanks,


      Tim
        Hey All, I too ran into more or less the same issue as Tim did, so does anybody have an answer for persuading perl to defragment its memory pool? It's really frustrating because, despite Perl having eaten up 2GB of memory, it still can't meet the need of storing 200MB of data. Big Monks - please help me out.
Re: out of memory problem after undef
by GrandFather (Saint) on Dec 03, 2008 at 01:12 UTC

    You are bumping up really close to the 2 GB virtual memory limit for 32 bit Windows. It may be that you are actually running out of memory. It is quite possible that OS X's virtual memory limit is different - 4 GB for 32 bit hardware and far larger for 64 bit.


    Perl's payment curve coincides with its learning curve.
      I'm no good with memory issues, so this may be a stupid question, but: Is it possible that the grandparent (as opposed to the GrandFather) just didn't realise that it's the operating system, not perl itself, that's running out of free memory (since—I think—perl doesn't release memory back to the OS until it exits)?

        The OP is, in effect, making two allocations with a free between them. The first allocation is a very large AOH that causes Perl to allocate very near 2 GB. That allocation is then "freed" by undefing the array containing the hashes. Then a largish array is allocated containing integers. It is in the second phase that I infer things are going pear shaped for the OP, although I can't reproduce that behavior.

        Perl doesn't generally free memory back to the system, but the space released following the first large allocation should nevertheless be available for the second allocation.
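
        A quick way to see that reuse-without-release behaviour for yourself is the sketch below (mine, Linux-only because it reads /proc/self/status; on Win32 you would watch the process in Task Manager instead). It prints the resident size around an allocate / undef / re-allocate cycle.

        #!/usr/bin/perl
        # Sketch (Linux-only): print the process's resident size around an
        # allocate / undef / re-allocate cycle. The RSS typically does not
        # fall much after the undef, and does not grow much on re-allocation,
        # because the freed space is reused inside the process.
        use strict;
        use warnings;

        sub vm_rss {
            open my $fh, '<', '/proc/self/status' or return 'n/a';
            while (<$fh>) { return "$1 kB" if /^VmRSS:\s+(\d+)/ }
            return 'n/a';
        }

        print "start:            ", vm_rss(), "\n";
        my @big = (1) x 2_000_000;
        print "after allocation: ", vm_rss(), "\n";
        undef @big;
        print "after undef:      ", vm_rss(), "\n";
        my @again = (1) x 2_000_000;
        print "after re-alloc:   ", vm_rss(), "\n";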


        Perl's payment curve coincides with its learning curve.
Re: out of memory problem after undef
by zentara (Archbishop) on Dec 03, 2008 at 19:45 UTC
    I can't say how it would work on win32, but on linux, I've had luck with Perl releasing huge memory back to the system, if the array was in a thread, and the thread was joined/finished.
    #!/usr/bin/perl
    use threads;
    use threads::shared;
    use warnings;
    use strict;

    print $$, "\n";    # top -p $$ to check

    my $alive :shared = 0;

    my $thr = threads->create(\&display);
    print "check mem use, then hit any key\n";
    <>;                # check mem use
    $alive = 1;
    $thr->join;

    print "check mem use, then hit any key\n";
    <>;                # check mem use

    $alive = 0;
    my $thr1 = threads->create(\&display);
    print "check mem use, then hit any key\n";
    <>;                # check mem use
    $alive = 1;
    $thr1->join;

    print "check mem use, then hit any key to finally exit\n";
    <>;                # check mem use

    sub display {
        my @array;
        foreach (1 .. 10000000) {
            push @array, 'aaaa';
        }
        while (1) {
            last if $alive;
            sleep 1;
        }
        undef @array;
        return;
    }

    I'm not really a human, but I play one on earth Remember How Lucky You Are

      I can't say how it would work on win32, but on linux, I've had luck with Perl releasing huge memory back to the system

      Win32 does release back to the OS, but we're not talking about releasing back to the system. Releasing back to Perl is fine.

Re: out of memory problem after undef
by ikegami (Patriarch) on Dec 03, 2008 at 20:33 UTC

    I thought that after I did undef, Perl could reuse the memory that was freed.

    It does.

    However, it looks like after freeing memory, the program still causes Perl to run out of memory.

    Because the second structure uses more memory (to build if not when complete) than the first.

      Because the second structure uses more memory (to build if not when complete) than the first.

      Not so!

      The AoH with 50 hashes each with 250_000 key/value pairs requires close to 1 GB when finished, and over 1.5GB to construct. Whereas the array of 10e6 integers requires just 220MB, and a peak of 330MB to construct.

      The problem is that none of the memory freed up by releasing the AoH is large enough to be reused for the AV for the array, because Perl doesn't defragment or coalesce freed memory. So, even though Perl has gobs of free space available to it, it has to go back to the OS for more in order to get big enough chunks for the array.
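
      For anyone who wants to sanity-check those figures, here is a rough sketch (mine) using Devel::Size from CPAN (not core): it measures scaled-down versions of the two structures and extrapolates linearly, which is crude but gets you into the right ballpark.

      #!/usr/bin/perl
      # Sketch: estimate the two structures' footprints at a smaller scale
      # with Devel::Size and extrapolate. The scaling is only approximate;
      # per-structure overheads do not grow exactly linearly.
      use strict;
      use warnings;
      use Devel::Size qw(total_size);

      # One hash of 10_000 pairs stands in for each of the 50 hashes of 250_000.
      my %sample_hash;
      $sample_hash{"1 1 $_ aaa"} = $_ for 1 .. 10_000;
      printf "10k-pair hash:  %6.1f MB => 50 x 250k pairs: roughly %6.0f MB\n",
          total_size(\%sample_hash) / 2**20,
          total_size(\%sample_hash) / 2**20 * 25 * 50;

      # 100_000 integers stand in for the 10_000_000-element array.
      my @sample_array = (-1) x 100_000;
      printf "100k-int array: %6.1f MB => 10e6 elements:   roughly %6.0f MB\n",
          total_size(\@sample_array) / 2**20,
          total_size(\@sample_array) / 2**20 * 100;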


      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      "Science is about questioning the status quo. Questioning authority".
      In the absence of evidence, opinion is indistinguishable from prejudice.

        Oops, I missed the outer 1..50 loop!

        I wonder how hard it would be to provide a function that defragments the heap, and what would be the downsides of calling it when large blocks are needed from the OS.