Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I've written a script that creates a few hashes which use ~5.4GB of RAM. The script runs to completion in about 15 minutes and prints its "I'm done" line right before the final exit; statement. However, the process then keeps running at 100% CPU for another 80-90 minutes before it actually terminates.

Can anyone clue me in as to why this might be happening?

I've tried undefing all the hashes, but that doesn't seem to free up any of the memory. I'm running this on a 64-bit machine with 16GB of RAM.

Let me know if there is any other info I can provide. Thanks in advance for the help!


Re: High RAM script takes forever to terminate
by Fletch (Bishop) on Nov 19, 2008 at 21:47 UTC

    The OS this is happening on would be helpful, but my first impression is that you're not helping matters by explicitly undefing or otherwise zeroing out your data structures. On most OSen it's not possible for a running process to return memory to the OS (Windows and, I believe, several flavours of *BSD can) short of the process terminating. If anything, you're probably making the underlying malloc/free machinery do extra work putting all that memory back into its free lists for nothing, since you're already exiting and the current process will never reuse it.
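
    In code, the counter-productive pattern looks something like this (hash names are hypothetical):

        # Sketch of the pattern described above: each undef forces perl
        # to walk the structure and hand every chunk back to its internal
        # free lists -- work that buys nothing, because the process exits
        # immediately afterwards and the OS reclaims the whole address
        # space in one go anyway.
        undef %big_hash_one;
        undef %big_hash_two;
        exit;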

    A utility like strace/truss/ktrace might help you see what system calls (if any) are being made after you've started exiting; that may shed light on what's leaving you hung.

    Update: Expanded with parenthetical in second paragraph.

    The cake is a lie.

      Currently using CentOS 5.

      The malloc issue you raise makes a lot of sense. I'll see if dropping those calls helps anything. Also, I'll take a look at those utilities and see what I find.

Re: High RAM script takes forever to terminate
by grinder (Bishop) on Nov 19, 2008 at 22:16 UTC

    I think you're probably noticing the cost of global destruction. At the very end of the program's execution, the interpreter walks through every single allocation and checks whether it needs to be DESTROYed. See perlobj, especially the bit at the end.

    I don't think you can switch this off. If anything, you can only make it worse :) A perl binary built with -DDEBUGGING will let you set the PERL_DESTRUCT_LEVEL to increasing values, which will exercise the garbage collector more and more severely. This is used to smoke out problems involving memory leaks.

    One avenue worth exploring is whether you are in fact leaking a colossal amount of memory during the program's execution, thus leaving 5.4GB to be cleaned up at the end. Remove the leak, and global destruction may become much cheaper.
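
    For a feel of what global destruction does, here is a tiny sketch (the Noisy class is made up for illustration):

        package Noisy;
        sub new     { bless {}, shift }
        sub DESTROY { warn "cleaned up during global destruction\n" }

        package main;
        our $obj = Noisy->new;   # package global: survives until interpreter shutdown
        print "last statement of the program\n";
        # After this point perl sweeps every remaining allocation looking
        # for things to DESTROY -- with 5.4GB of hashes, that sweep is
        # where the extra 80-90 minutes can go.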

    • another intruder with the mooring in the heart of the Perl

Re: High RAM script takes forever to terminate
by ikegami (Patriarch) on Nov 19, 2008 at 21:57 UTC
    A hackish solution would be to call _exit (from the POSIX module), which terminates the process immediately and skips global destruction.
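
    Something along these lines (a sketch, not tested against the OP's script):

        use POSIX qw(_exit);

        $| = 1;              # autoflush: _exit skips normal stdio cleanup
        print "I'm done\n";

        # POSIX::_exit() ends the process at the OS level immediately:
        # no END blocks, no DESTROY methods, no global destruction sweep.
        # The kernel reclaims the memory in one shot.
        _exit(0);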
Re: High RAM script takes forever to terminate
by gwadej (Chaplain) on Nov 19, 2008 at 23:26 UTC

    I was doing some profiling of different ways to clear a hash a short while back and ran across an interesting data point.

    %hash = ();

    was significantly faster than letting the hash go out of scope or using delete.

    I don't know if this will help. (I suspect it won't.) But it's worth a try.
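
    A rough way to compare the approaches yourself (sizes here are arbitrary):

        use Benchmark qw(cmpthese);

        sub build_hash { my %h = map { $_ => 1 } 1 .. 50_000; return \%h }

        cmpthese( -2, {
            assign_empty => sub { my $h = build_hash(); %$h = () },
            undef_hash   => sub { my $h = build_hash(); undef %$h },
            delete_slice => sub { my $h = build_hash(); delete @{$h}{ keys %$h } },
        } );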

    G. Wade
Re: High RAM script takes forever to terminate
by toma (Vicar) on Nov 20, 2008 at 07:25 UTC
    This is a swift exit that works great on Linux and probably others:
    `kill -9 $$`;
    It should work perfectly the first time! - toma
      Ha! Hilarious, this worked great. Just had to smack it with the right kind of hammer to end the incessant thrashing.
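
      For what it's worth, the same trick works without shelling out (an untested sketch):

          # Send SIGKILL to our own process id. SIGKILL cannot be caught
          # or ignored, so perl dies on the spot and never starts the
          # global destruction sweep.
          kill 'KILL', $$;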