Does Perl have a baked-in memory limit?

by cormanaz (Deacon)
on Nov 11, 2023 at 16:56 UTC

cormanaz has asked for the wisdom of the Perl Monks concerning the following question:

Good day monks. I have a half-gig file containing a JSON object I'm trying to read in using File::Slurp. I get an out of memory error. Thing is, I have 32 gig of installed memory, and only 20 gig is in use when I try to run the script. Should be plenty left to hold a half-gig string. Is there some absolute limit on memory for Perl?

Re: Does Perl have a baked-in memory limit?
by eyepopslikeamosquito (Archbishop) on Nov 11, 2023 at 22:33 UTC

        Is that... Don't use File::Slurp for this use case or is it don't use File::Slurp EVER! ???

        For me, it's ever ... despite being good mates with File::Slurp's original author Uri Guttman. :)

        Though mostly a matter of taste for individuals working alone, dependencies are crucial when working in teams and especially for CPAN authors (such as the BOD :) because the price of depending on simple convenience modules, such as File::Slurp, is way too high:

        • What if your dependent module has a security vulnerability?
        • What if the author abandons it?
        • How quickly can you isolate/troubleshoot a bug in its code? (e.g. in a vital production system)

        It's a different story for more complex domains, such as DBI and XML, where it makes sense to leverage the work of experts in fields that you are not expert in.

        More detail on this topic can be found in the Dependencies section at Writing Solid CPAN Modules.

        👁️🍾👍🦟

        I don't use File::Slurp ever (but without the uppercase and the punctuation :-)

        Leon's blog post explains (some of) the problems with File::Slurp and at the end offers three alternatives. I tend to use Path::Tiny in preference because it isn't overkill, is already widely used, and does just about anything I am likely to want with regard to files and directories, including slurping and spewing. YMMV.

        I should also say that for published code which only slurps a single file, I would probably just do that without a module, as the overhead isn't really worth it in terms of efficiency and maintenance.
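        Something like this is all it takes (a minimal sketch, with $file standing in for whatever path you have):

            my $content = do {
                open my $fh, '<:raw', $file or die "Can't open '$file': $!";
                local $/;    # undef the input record separator (slurp mode)
                <$fh>;       # readline now returns the entire file
            };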


        🦛

Re: Does Perl have a baked-in memory limit?
by dave_the_m (Monsignor) on Nov 11, 2023 at 21:31 UTC
    Have you experimented with the buf_ref, scalar_ref and blk_size options to File::Slurp's read_file() function? In particular, with the default settings read_file() will keep reading 1MB blocks from the file and appending them to a local variable. Each append may cause perl to reallocate the buffer used for the string, and depending on your OS's malloc() library, you may end up with allocated but never reused 1MB, 2MB, 3MB, etc. buffers. Try setting blk_size to the size of the file (possibly rounded up to a power of 2).

    When read_file() returns the string, the return value gets copied (possibly twice). Perl these days is fairly good at sharing or stealing buffers when copying strings, but buf_ref or scalar_ref may still help, depending on the circumstances and your perl version.
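    For example (an untested sketch along those lines; $file is a placeholder for the actual path):

        use File::Slurp qw(read_file);

        my $size = -s $file;        # file size in bytes
        my $text;
        read_file(
            $file,
            buf_ref  => \$text,     # slurp directly into this scalar,
                                    # avoiding a copy on return
            blk_size => $size,      # one block covering the whole file
            binmode  => ':raw',
        );

        # or have read_file() hand back a reference instead,
        # which also avoids copying the half-gig string:
        my $ref = read_file( $file, scalar_ref => 1 );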

    Dave.

Re: Does Perl have a baked-in memory limit?
by kevbot (Vicar) on Nov 11, 2023 at 17:18 UTC
Re: Does Perl have a baked-in memory limit?
by ikegami (Patriarch) on Nov 15, 2023 at 04:08 UTC

    No.

    What's the output of perl -V:archname? (That's an uppercase "V".) I suspect you are using an environment with a 4 GiB address space, a huge chunk of which is reserved by the OS. If so, switch to a 64-bit build.
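    For reference, a 64-bit build reports something like this (the exact archname varies by OS and build):

        $ perl -V:archname
        archname='x86_64-linux-thread-multi';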

Re: Does Perl have a baked-in memory limit?
by NERDVANA (Priest) on Nov 15, 2023 at 11:40 UTC
    Are you sure you're using a 64-bit perl, and not a 32-bit one left over from a previous system? A 32-bit perl could run out of RAM pretty easily.
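    One quick way to check is the configured pointer size (8 means a 64-bit perl, 4 means 32-bit):

        $ perl -V:ptrsize
        ptrsize='8';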
Re: Does Perl have a baked-in memory limit?
by Anonymous Monk on Nov 12, 2023 at 17:48 UTC

    As another attempt at a rational explanation (guessing is silly without a code sample and the error message): you may have inadvertently put the call to slurp into list context, like:

    my $json = JSON->new->decode( read_file($file) );

    which then tries to split the 5e8 bytes into presumably quite short lines, and may consume a few times 1e9 bytes of RAM long before JSON complains about wrong usage.
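    If that is what happened, forcing scalar context avoids the trap, e.g.:

        # force scalar context so read_file() returns the whole file
        # as one string rather than a list of lines
        my $json = JSON->new->decode( scalar read_file($file) );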
