Out of memory problem when copying contents of file into array

by junky123 (Acolyte)
on Feb 18, 2005 at 15:23 UTC ( [id://432357] )

junky123 has asked for the wisdom of the Perl Monks concerning the following question:

Hello Friends,

I need to get the last 100 lines of a log file and display them in a text area. To do this, I open the file with a filehandle and copy its contents into an array. I get the total number of lines and then display the last 100. When I run this, I get an out-of-memory message in the textarea.

Below is the code:

$lineno = 100;
# Opening file
open(FILE, "$file") or die "Can't find $file \n";
# Transfer contents of file to array
@lines = <FILE>;
# Count number of lines in the file
$num = @lines;
for (; $lineno > 0; $lineno--) {
    # Get last lines from the file and print them
    @tail = @lines[$num - $lineno];
    print(@tail);
}
close(FILE);
Thanks

20050218 Janitored by Corion: Added formatting

Replies are listed 'Best First'.
Re: Out of memory problem when copying contents of file into array
by husker (Chaplain) on Feb 18, 2005 at 15:36 UTC
Re: Out of memory problem when copying contents of file into array
by TedYoung (Deacon) on Feb 18, 2005 at 15:38 UTC

    Below, I use an array to keep track of the last 100 lines read from the file. This keeps you from having to read the entire file into memory, which is what gives you the out-of-memory error.

    sub tailFile {
        my ($file, $lines) = @_;
        open F, $file or die $!;
        my @lines;
        while (<F>) {
            push @lines, $_;
            # keep only the last $lines lines as a sliding window
            shift @lines if @lines > $lines;
        }
        close F;
        return @lines;
    }
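
    For instance, a hypothetical call (the path and count here are only examples):

        my @tail = tailFile('/var/log/app.log', 100);
        print @tail;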

    Update: As suggested, you may want to check out File::ReadBackwards. After all, you should always prefer a well-developed and tested module over re-inventing the wheel. But at least my code gives you one way of doing it.

    Ted Young

    ($$<<$$=>$$<=>$$<=$$>>$$) always returns 1. :-)
Re: Out of memory problem when copying contents of file into array
by ZlR (Chaplain) on Feb 18, 2005 at 15:39 UTC
    Hello,

    Although I've never used it myself, I remember reading about File::ReadBackwards.

    It sounds like you could use it :)
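
    A minimal sketch of how that might look, based on the module's documented new/readline interface (untested here; the 100-line count mirrors the OP's requirement):

        use File::ReadBackwards;

        my $bw = File::ReadBackwards->new($file)
            or die "Can't open $file: $!";
        my @tail;
        while (@tail < 100 and defined(my $line = $bw->readline)) {
            unshift @tail, $line;    # readline hands back lines last-to-first
        }
        print @tail;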

    ZlR

Re: Out of memory problem when copying contents of file into array (Benchmarks)
by bpphillips (Friar) on Feb 18, 2005 at 16:41 UTC
    Here are the benchmarks on an 8.6MB, 156,896-line file:
                          Rate TedYoung   OP's RandomWalk `tail -100` IO::All->backwards File::ReadBackwards
    TedYoung            1.37/s       --   -14%       -39%        -99%               -99%               -100%
    OP's                1.60/s      17%     --       -29%        -99%               -99%               -100%
    RandomWalk          2.26/s      64%    41%         --        -99%               -99%               -100%
    `tail -100`          157/s   11309%  9679%      6841%          --               -37%                -82%
    IO::All->backwards   251/s   18145% 15538%     10999%         60%                 --                -71%
    File::ReadBackwards  868/s   63105% 54075%     38349%        454%               246%                  --
    
    Update: It's interesting to note that on a much smaller file (47KB in my case), tail occasionally wins over File::ReadBackwards and IO::All, but those three consistently outperform the others.

    Update (again): I didn't look at the OP's code closely enough when I made the benchmark. I've fixed it so that it works comparably to the other examples. I also removed the split() from the `tail -100` entry, which (much to my surprise) isn't necessary. Neither of these changes appears to have affected the speed comparisons in any measurable way...

    The code I used to do the benchmark:
      'OP\'s' => sub {
          my $lineno = 100;
          open( FILE, "$f" ) or die "Can't find $f\n";
          my @lines = <FILE>;
          my $num   = @lines;
          for ( ; $lineno > 0 ; $lineno-- ) {
              my @tail = @lines[ $num - $lineno ];
          }
          close(FILE);
      },
      This can't be correct. You never create an array with 100 lines; you create 100 arrays of one line each, using a single-element slice ('use warnings' complains).
      '`tail -100`' => sub { my @lines = split( /\n/, `tail -100 $f` ); },
      Why the explicit splitting? Why not just
      my @lines = `tail -100 $f`;
      Not that it makes a huge difference speedwise.
        I didn't write it... I just copied it from the OP's post. Even a cursory look shows it's obviously broken, but I didn't give it one when I was setting up the Benchmark. My mistake.
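
    For reference, a Rate table like the one above is what Benchmark's cmpthese produces. A minimal harness along these lines would generate it (the file name, iteration count, and the two subs shown are placeholders, not the author's actual script):

        use Benchmark qw(cmpthese);
        use File::ReadBackwards;

        my $f = 'big.log';    # placeholder path

        cmpthese( 10, {
            'File::ReadBackwards' => sub {
                my $bw = File::ReadBackwards->new($f) or die $!;
                my @lines;
                while (@lines < 100 and defined(my $l = $bw->readline)) {
                    push @lines, $l;
                }
            },
            '`tail -100`' => sub { my @lines = `tail -100 $f`; },
        } );
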
Re: Out of memory problem when copying contents of file into array
by jbrugger (Parson) on Feb 18, 2005 at 15:31 UTC
    You could try it this way: (for large files)
    ** Updated to id://1382's suggestion (is that documented somewhere?) **
    sub perform {
        # process the file one line at a time to avoid the out-of-memory error
        open( FILE, "< ./file.txt" ) or die "Can't open : $!";
        while (my $line = <FILE>) {
            chomp($line);          # chomp modifies in place and returns a count,
            processLine($line);    # so chomp the line first, then pass it along
        }
        close FILE;
    }
    sub processLine {
        my $line = shift;
        # etc.
    }

      Unfortunately, for loops build a list out of their contents, which means that a for loop over <FILE> reads the entire file into memory before the first iteration. while is the right way to process a line at a time.
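
      A minimal illustration of the difference (the file name is just an example):

          open my $fh, '<', 'file.txt' or die $!;

          # for evaluates <$fh> in list context: the whole file is pulled
          # into a temporary list before the loop body ever runs.
          for my $line (<$fh>) { print $line }

          close $fh;
          open $fh, '<', 'file.txt' or die $!;

          # while evaluates <$fh> in scalar context: only one line is
          # held in memory at a time.
          while (my $line = <$fh>) { print $line }
          close $fh;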

Re: Out of memory problem when copying contents of file into array
by Random_Walk (Prior) on Feb 18, 2005 at 16:20 UTC

    Here is a version based on TedYoung's, but using a circular buffer, which I think is a little more efficient. I need to find a really big file to do some benchmarks.

    #!/usr/bin/perl
    use strict;
    use warnings;

    sub tailFile {
        my ($file, $lines) = @_;
        open F, $file or die $!;
        my @lines;
        $#lines = $lines - 1;    # pre-extend the buffer
        my $i = 0;
        while (<F>) {
            $lines[$i++] = $_;
            $i = 0 if $i == $lines;    # wrap around the circular buffer
        }
        close F;
        # $i now indexes the oldest line; stitch the buffer back into order
        return @lines[$i .. $#lines, 0 .. $i - 1];
    }

    print tailFile("/etc/passwd", 10);

    Cheers,
    R.

    Pereant, qui ante nos nostra dixerunt!
Re: Out of memory problem when copying contents of file into array
by Miguel (Friar) on Feb 18, 2005 at 16:35 UTC
    Another alternative:
    #!/usr/bin/perl -w
    use strict;
    use IO::All;

    my @reversed_io = io("file.txt")->backwards->getlines;
    my @lines;
    # indices 0..99 are the last 100 lines, newest first
    push @lines, $reversed_io[$_] for ( 0 .. 99 );
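
    Note that backwards hands the lines back last-first, so @lines ends up in reverse order; a final print reverse @lines; would restore the original ordering for display.
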
Re: Out of memory problem when copying contents of file into array
by Anonymous Monk on Feb 18, 2005 at 16:15 UTC
    system "tail", "-$lineno", $file;
    HTH. HAND.
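
    If the output needs to land in a variable for the textarea rather than on stdout, backticks capture it instead (a sketch; assumes $file contains no shell metacharacters):

        my $tail = `tail -$lineno $file`;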
