
Out of memory!

by jesuashok (Curate)
on Feb 07, 2007 at 04:23 UTC ( #598685=perlquestion )
jesuashok has asked for the wisdom of the Perl Monks concerning the following question:

Dear monks,

I have a Perl script that reads some files and stores their lines in a hash. Each file is around 3 MB or more. When I run the script I get the following error:

[belief] /apps/inst2/metrica/analysis_ericsson/schema_analysis> perl s -r rename_columns \
    -u /apps/inst2/metrica/anthony/Ericson_R10_Onsite/VFOZ_BACKUP/summaryspr/ \
    -y /apps/inst2/metrica/anthony/Ericson_R10_Onsite/VFOZ_BACKUP/metalayer/ \
    -o only_in_old_schema \
    -p /apps/inst2/metrica/anthony/Ericson_R10_Onsite/VFOZ_BACKUP/reportspr/
Out of memory!
Is there any way I can resolve this issue? Is it possible to control the memory usage?

Replies are listed 'Best First'.
Re: Out of memory!
by Tanktalus (Canon) on Feb 07, 2007 at 04:55 UTC

    Generally speaking, when I've had that problem, it was a memory limit imposed via ulimit; removing that limit 'solved' the problem, at least insofar as it let me use far more memory. Perhaps you have an underlying memory problem, wasting memory or leaking it; we can't be sure from your description. Assuming that's not the case, though, it's probably a ulimit on memory.
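    One way to see the limit Tanktalus means from inside a Perl script is to ask the shell, since ulimit is a shell builtin rather than a standalone command. A minimal sketch (assuming a POSIX /bin/sh is available):

    ```perl
    use strict;
    use warnings;

    # ulimit is a shell builtin, so run it via sh to read the current
    # virtual-memory limit: a number of kbytes, or "unlimited".
    chomp(my $limit = `sh -c 'ulimit -v'`);
    print "virtual memory limit: $limit\n";
    ```

    If this prints a number rather than "unlimited", raising or removing the limit in the invoking shell (e.g. `ulimit -v unlimited`) is worth trying before hunting for leaks.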

Re: Out of memory!
by GrandFather (Sage) on Feb 07, 2007 at 04:29 UTC

    Is there any way you can post sample code that demonstrates the issue and indicate how many files are being manipulated?

    We need a little more information than "I have a problem with large hashes. How do I solve it?" if you want a better answer than "Use a tied hash" or "Install more memory".

    DWIM is Perl's answer to Gödel
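    The "tied hash" answer GrandFather mentions can be sketched with the DB_File module (assuming it is installed; the file name lines.db and the key scheme here are just illustrative). Entries live on disk instead of in process memory, so the hash can grow far beyond RAM:

    ```perl
    use strict;
    use warnings;
    use Fcntl;
    use DB_File;

    # Tie %lines to an on-disk Berkeley DB hash; reads and writes go
    # to lines.db instead of keeping every entry in memory.
    my %lines;
    tie %lines, 'DB_File', 'lines.db', O_RDWR|O_CREAT, 0644, $DB_HASH
        or die "Cannot tie lines.db: $!";

    $lines{'file1:1'} = 'first line of file1';
    my $fetched = $lines{'file1:1'};
    print "$fetched\n";

    untie %lines;
    unlink 'lines.db';    # remove the demo database file
    ```

    The trade-off is speed: every hash access becomes a disk operation, but the script stays within whatever memory limit applies.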
Re: Out of memory!
by chargrill (Parson) on Feb 07, 2007 at 04:55 UTC

    "Doctor, it hurts when I do this!"

    "Then don't do that."

    So the answer is simple - don't write perl programs that consume more memory than your system can allocate.

    s**lil*; $*=join'',sort split q**; s;.*;grr; &&s+(.(.)).+$2$1+; $; = qq-$_-;s,.*,ahc,;$,.=chop for split q,,,reverse;print for($,,$;,$*,$/)
Re: Out of memory!
by quester (Vicar) on Feb 07, 2007 at 08:34 UTC
    Two other possible, somewhat painful, approaches:

    1. Step through the program with the debugger and check the memory size to see where you are in the code when it increases:

    $ perl -de0
    Loading DB routines from version 1.28
    Editor support available.
    Enter h or `h h' for help, or `man perldebug' for more help.
    main::(-e:1):   0
      DB<1> !!ps v
        PID TTY      STAT   TIME MAJFL  TRS    DRS    RSS %MEM COMMAND
        ...
       4763 pts/1    S+     0:00     0 1013   4874   4348  0.4 perl -de0
      DB<2> @array=(1..1_000_000)
      DB<3> !!ps v
        PID TTY      STAT   TIME MAJFL  TRS    DRS    RSS %MEM COMMAND
        ...
       4763 pts/1    R+     0:01     0 1013 103142 102400  9.8 perl -de0


    2. Take a look at "Debugging Perl memory usage" in perldebguts and see if using PERL_DEBUG_MSTATS and Devel::Peek::mstat would help.

    ...but first look at the output of perl -V and see if it says "usemymalloc=n", in which case you won't have any memory statistics... :-(
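    Rather than scanning the full perl -V output by eye, the same build flag can be read from the core Config module; a quick sketch:

    ```perl
    use strict;
    use warnings;
    use Config;

    # Report whether this perl was built with Perl's own malloc.
    # PERL_DEBUG_MSTATS only yields statistics when this is 'y'.
    print "usemymalloc=$Config{usemymalloc}\n";
    ```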

Re: Out of memory!
by siva kumar (Pilgrim) on Feb 07, 2007 at 06:43 UTC
    Ashok, please check ulimit -a; sample output looks like this:
    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 1024
    max locked memory       (kbytes, -l) 32
    max memory size         (kbytes, -m) unlimited
    open files                      (-n) 1024
    pipe size            (512 bytes, -p) 8
    POSIX message queues     (bytes, -q) 819200
    stack size              (kbytes, -s) 10240
    cpu time               (seconds, -t) unlimited
    max user processes              (-u) 3959
    virtual memory          (kbytes, -v) unlimited
    file locks                      (-x) unlimited
    Verify whether virtual memory or file size is limited. If they are not, check your code for memory leaks.
Re: Out of memory!
by swampyankee (Parson) on Feb 07, 2007 at 14:29 UTC

    Look into the method you're using to process the files. For example, do you really need to store the entire contents of every file in the hash? From your description, it seems you may be doing that. Alternatively, if you do need the contents of all the files at once, you could try tying the hash to a database.

    Without some more information about what you are trying to accomplish ("read some files and storing those lines into a hash" is not very descriptive), it will be impossible to give anything other than quite generic advice.
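    As a sketch of the first suggestion above: if the script only needs one line at a time, reading line by line keeps memory flat no matter how large the files are. (The %count summary hash here is hypothetical; the point is to keep a small derived result rather than every raw line.)

    ```perl
    use strict;
    use warnings;

    # Process each file a line at a time instead of slurping every
    # line into a hash first; only a small summary stays in memory.
    my %count;
    for my $file (@ARGV) {
        open my $fh, '<', $file or die "Cannot open $file: $!";
        while (my $line = <$fh>) {
            chomp $line;
            $count{$line}++;    # remember a count, not the whole file
        }
        close $fh;
    }
    print scalar(keys %count), " distinct lines seen\n";
    ```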


    Insisting on perfect safety is for people who don't have the balls to live in the real world.

    —Mary Shafer, NASA Dryden Flight Research Center

Node Type: perlquestion [id://598685]
Approved by GrandFather