http://www.perlmonks.org?node_id=1018723

shan_emails has asked for the wisdom of the Perl Monks concerning the following question:

Hi All

I need to read the contents of a file (about 2 GB) on a Unix box and then assign it to a variable. When I do that, it shows an "Out of memory" error. How can I handle this scenario?

bash-2.00$ uname
HP-UX
bash-2.00$ ulimit -a
core file size (blocks)     2097151
data seg size (kbytes)      1048576
file size (blocks)          unlimited
max memory size (kbytes)    unlimited
open files                  2048
pipe size (512 bytes)       16
stack size (kbytes)         256000
cpu time (seconds)          unlimited
max user processes          5001
virtual memory (kbytes)     unlimited

And my code is

my $contents;
open(my $fh, '<', 'control_report_file')
    or die "cannot open file control_report_file $!";
{
    local $/;
    $contents = <$fh>;
}
close($fh);

bash-2.00$ perl test.pl
Out of memory!

Please, any help would be much appreciated!

Thanks

Shanmugam A.

Replies are listed 'Best First'.
Re: out of memory
by choroba (Cardinal) on Feb 14, 2013 at 11:46 UTC
    Why do you need to keep the whole file in memory? Can't you process the file in chunks? This seems like an XY Problem: what are you trying to achieve?
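
    For illustration, here is a minimal sketch of reading the file in fixed-size chunks instead of slurping it whole (the 8 MB chunk size is an arbitrary choice, not anything from the original post):

    use strict;
    use warnings;

    my $chunk_size = 8 * 1024 * 1024;    # read 8 MB at a time; tune as needed
    open(my $fh, '<', 'control_report_file')
        or die "cannot open control_report_file: $!";
    while (read($fh, my $buffer, $chunk_size)) {
        # process $buffer here; only one chunk is in memory at any time
    }
    close($fh);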

      I need to put that content into a database table.

      Is there any other way to handle this scenario?

        When loading data into a database, I recommend using the bulk loading tools available with the database. The easiest approach is to use Perl to write the available data to a new file in a format suitable for the bulk loading tool, using either fixed-width or delimited rows of text, or SQL statements.
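
        As a rough sketch of that conversion step (the output file name, the whitespace splitting, and the pipe delimiter are assumptions, since the input format isn't shown):

        use strict;
        use warnings;

        open(my $in,  '<', 'control_report_file') or die "cannot open input: $!";
        open(my $out, '>', 'bulk_load.dat')       or die "cannot open output: $!";
        while (my $line = <$in>) {
            chomp $line;
            my @fields = split /\s+/, $line;          # split into fields as your format requires
            print {$out} join('|', @fields), "\n";    # one delimited row per input record
        }
        close($in);
        close($out);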

        If you are adding more than one row, you can probably process the file chunk by chunk and insert the corresponding rows into the database one by one (or create a file the database can bulk-load). See also DBI.
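
        A minimal DBI sketch of the row-by-row approach (the DSN, table name, and column layout are placeholders, not taken from the original post):

        use strict;
        use warnings;
        use DBI;

        my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'password',
                               { RaiseError => 1, AutoCommit => 0 });
        my $sth = $dbh->prepare(
            'INSERT INTO control_report (line_no, line_text) VALUES (?, ?)');

        open(my $fh, '<', 'control_report_file')
            or die "cannot open control_report_file: $!";
        while (my $line = <$fh>) {
            chomp $line;
            $sth->execute($., $line);    # $. is the current input line number
        }
        close($fh);
        $dbh->commit;
        $dbh->disconnect;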

        You haven't provided any example file content. Essentially, read the file one record at a time and insert each record into the database. This is of course a simplistic approach, and it may not be best suited to your system (number of records, database type, etc.), so creating a load file based on your input may be a better idea.

Re: out of memory
by marto (Cardinal) on Feb 14, 2013 at 11:56 UTC
Re: out of memory
by jeffenstein (Hermit) on Feb 15, 2013 at 16:58 UTC
    On HP-UX, 32-bit processes can only use about 600 MB of memory for data because of the HP-UX process memory layout. In theory it's 1 GB of memory for data, but I've never seen that happen in practice.

    If you really need to load it all into memory, you'll need to use a 64-bit Perl, which can be found in /opt/perl_64/bin if the system administrator has installed the Perl supplied by HP.
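
    To check whether a given perl binary was built with 64-bit support, you can inspect its configuration; the output shown here is what a 64-bit build would typically report:

    bash-2.00$ /opt/perl_64/bin/perl -V:use64bitall
    use64bitall='define';
    bash-2.00$ /opt/perl_64/bin/perl -MConfig -le 'print $Config{ptrsize}'
    8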
Re: out of memory
by necroshine (Acolyte) on Feb 14, 2013 at 17:54 UTC
    Why can't you do it like this:
    open(my $fh, '<', 'control_report_file') or die "blabla $!";
    while (<$fh>) {
        my $content = $_;
        chomp $_;
        # Do whatever you need
    }
    close $fh;
    # hey look... I still have memory left...
Re: out of memory
by Anonymous Monk on Feb 14, 2013 at 23:46 UTC
    Bulk-loading is nothing new. Most SQL clients know how to read a file directly and stuff it into a database row ... and to deal with blobs much larger than anyone's memory size. You don't need to attempt to do it "this way." It's not a new requirement ... you're just barking up the wrong tree right now.