out of memory problem after undef
by sduser81 (Novice) on Dec 03, 2008 at 00:25 UTC

sduser81 has asked for the wisdom of the Perl Monks concerning the following question:
I am experimenting with memory allocation in Perl, and I've hit a problem where Perl crashes with an out-of-memory error.
I am running a Win32 build of Perl on Windows XP Pro.
Here is the code:
I create a large array of hashes, which drives memory usage up to around 1.8 GB. I then undef the array and try to build a second array that will hold 10 million integers. Perl crashes after allocating only around 4.2 million elements of the new array.
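A scaled-down sketch of what such a test script might look like (the sizes here are roughly 1000× smaller than described above so it runs quickly; the variable names are hypothetical, not from the original script):

```perl
use strict;
use warnings;

# Phase 1: build a large array of hashes (stands in for the ~1.8 GB structure).
my @big;
push @big, { id => $_, value => "x" x 10 } for 1 .. 1000;
print "built ", scalar @big, " hashes\n";

# Phase 2: free it. This empties the array, but the memory typically goes
# back to perl's internal allocator, not necessarily to the OS.
undef @big;

# Phase 3: build a second array of integers (stands in for the 10M-element array).
my @ints = (0) x 1000;
print "built ", scalar @ints, " integers\n";
```

In the real script, phase 3 is where the crash reportedly occurs, even though phase 2 has already released the first array.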
I thought that after the undef, Perl would be able to reuse the memory that was freed.
However, even after freeing that memory, the program still runs Perl out of memory.
Please advise me on what could be preventing Perl from reusing the freed memory.
I tried the same program on OS X, and it worked there without crashing.
This is just a test program that simulates behavior of a real script I have.
Thanks for your help.