
Storable, Cygwin, and memory issues

by dpuu (Chaplain)
on Apr 06, 2006 at 20:57 UTC ( #541732=perlquestion )
dpuu has asked for the wisdom of the Perl Monks concerning the following question:

I was doing some refactoring: the original script combined a parser module that created a big data structure with a bunch of backend modules that dumped out the data in various formats. I restructured this into multiple scripts: the "parser" script parses the input files, then dumps the data structure to disk using Storable::nstore_fd. The backend scripts then read it back (using fd_retrieve) and do their backend work. This new architecture was generally considered an improvement.
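The split described above can be sketched roughly like this (the file name and data structure are illustrative stand-ins, not from the original code):

```perl
use strict;
use warnings;
use Storable qw(nstore_fd fd_retrieve);

# --- "parser" script: build the structure and dump it ---
my $data = { tokens => [qw(a b c)], count => 3 };   # stand-in for the real parse tree
open my $out, '>', 'parse.dump' or die "open: $!";
binmode $out;                       # essential on Windows/Cygwin
nstore_fd($data, $out);             # network order, portable across platforms
close $out;

# --- backend script: read it back and do backend work ---
open my $in, '<', 'parse.dump' or die "open: $!";
binmode $in;
my $restored = fd_retrieve($in);
close $in;
print "count = $restored->{count}\n";
```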

But then I started getting bug reports from Cygwin users. The backend scripts were dying with "Out of memory during ridiculously large request at ../../lib/ ..." errors. These errors appear even when I set HKEY_CURRENT_USER\Software\Cygnus Solutions\Cygwin\heap_chunk_in_mb=1024 to tell Cygwin to allow the process a full gigabyte of memory.

I did some profiling (a binary search using "limit vmemoryuse ..."). On Linux I determined the memory consumption to be 43 MB. Hardly excessive (the file dumped by Storable is 10 MB). Replacing Storable with the original parser code reduced it to 42 MB ... and this fixed the Cygwin issues. A different dataset goes from 39 MB to 38 MB, which refutes the suggestion that the problem is that we are right on the edge of some limit.

So my question is: are there any known issues with the way Storable works on Cygwin that would cause this excessive memory use? I'm considering switching to SQLite for the intermediate file, but I'd like to understand the problem before doing the work.

And another question: are there any good memory profiling tools for Perl? I use -d:DProf for speed profiling, but for these kinds of issues I need to know where my memory is being used.
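One option worth a look (assuming the module is installed; it is not in core) is Devel::Size, which reports how many bytes a given structure occupies, either the top-level container alone or everything it references:

```perl
use strict;
use warnings;
use Devel::Size qw(size total_size);

# A stand-in structure; substitute the real parse tree to see where memory goes.
my $data = { list => [ 1 .. 10_000 ] };

printf "shallow: %d bytes\n", size($data);        # the hash itself only
printf "deep:    %d bytes\n", total_size($data);  # hash plus everything it references
```

Walking the big data structure and printing total_size for each top-level key is a crude but effective way to find which branch is eating the memory.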

Opinions my own; statements of fact may be in error.

Re: Storable, Cygwin, and memory issues
by vkon (Deacon) on Apr 06, 2006 at 21:47 UTC
    A few tens of megabytes should certainly be processed okay.
    I once handled an Archive::Zip object of about 35 MB this way, built to archive some 400 MB in order to save archive-creation time, without any problems.

    You didn't provide much detailed version information, so I'm guessing from what I see in your post.
    Since you mentioned "Cygnus", I assume this Cygwin is rather old, and it could be that its default mounts are text (as opposed to binary); that could corrupt the stored numbers and thus trigger huge memory allocations.

    As for memory profiling: for consumption near 1 GB I use a very primitive but quite handy method, since the ordinary Task Manager shows the memory eaten just fine.
    Otherwise, Devel::Peek for precise analysis, but that requires some programming.
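The text-mount theory is easy to check from Perl itself. The sketch below (scratch file name is hypothetical) writes CRLF pairs in binary mode, then reads them back without binmode: on a text-mode mount each CRLF would collapse to LF, so the byte count read back falls short of the size on disk.

```perl
use strict;
use warnings;

# Write a small file containing bytes that text-mode translation would mangle.
my $file = 'binmode_check.tmp';     # hypothetical scratch-file name
open my $out, '>', $file or die "open: $!";
binmode $out;
print $out "\x0D\x0A" x 100;        # CRLF pairs: the classic victim of text mounts
close $out;

my $on_disk = -s $file;             # 200 bytes as written

# Read it back WITHOUT binmode, through whatever translation the mount applies.
open my $in, '<', $file or die "open: $!";
my ($buf, $seen) = ('', 0);
$seen += length $buf while read($in, $buf, 65536);
close $in;
unlink $file;

printf "on disk: %d, read back: %d (%s)\n", $on_disk, $seen,
    $seen == $on_disk ? "binary mount" : "text-mode translation";
```

If the two numbers differ, the Storable stream is being mangled on read, which would explain nonsense length fields and the "ridiculously large request" allocations.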


      Thanks for the reply.

      Yes, it is quite an old cygwin: 1.3.22, running perl 5.8.0 with Storable 2.04. The mount points are binmode.

        ... not the oldest Cygwin ever :):)
        Did you succeed in narrowing the problem down to a reproducible test case?

        (sorry for being trivial:)

Re: Storable, Cygwin, and memory issues
by roboticus (Canon) on Apr 06, 2006 at 21:56 UTC
    I use cygwin every day (and really love it). I used to run out of memory all the time because the default limit isn't all that much, considering...

    From the user's manual:

    Changing Cygwin’s Maximum Memory

    By default no Cygwin program can allocate more than 384 MB of memory (program+data). You should not need to change this default in most circumstances. However, if you need to use more real or virtual memory in your machine you may add an entry in either the HKEY_LOCAL_MACHINE (to change the limit for all users) or HKEY_CURRENT_USER (for just the current user) section of the registry.

    Add the DWORD value heap_chunk_in_mb and set it to the desired memory limit in decimal MB. It is preferred to do this in Cygwin using the regtool program included in the Cygwin package. (For more information about regtool or the other Cygwin utilities, see the Section called Cygwin Utilities in Chapter 3 or use the --help option of each util.) You should always be careful when using regtool since damaging your system registry can result in an unusable system. This example sets the memory limit to 1024 MB:

    regtool -i set /HKLM/Software/Cygnus\ Solutions/Cygwin/heap_chunk_in_mb 1024
    regtool -v list /HKLM/Software/Cygnus\ Solutions/Cygwin


      Yes, I tried that: it still fails even when told it can use the full gigabyte. And even if the limit were the problem, the fact that the same script (with the same data) requires only 43 MB on Linux would suggest something is wrong (or at least different) on Cygwin.

        In my rush to actually try to contribute something useful, I failed to fully read your original post, and totally missed the part where you upped the memory limit. My earlier comment should be downvoted. Sorry about that.

        Unfortunately, I have no insight into why you're running out of memory...


Node Type: perlquestion [id://541732]
Approved by Tanktalus