
Return the contents of a file

by jamescohen (Initiate)
on Feb 04, 2000 at 23:02 UTC ( #2871=snippet )
Description: I think both of these use the minimum amount of code while still doing the job properly (using my and close).
Post if you have a better way!
Return the file contents as a scalar value
sub readfile {
    my $OP;
    open FS, $_[0];
    while (<FS>) { $OP .= $_ }
    close FS;
    return $OP;
}

Returns the contents of a file in list form (line by line)
sub readlines {
    open FS, $_[0];
    my @OP = <FS>;
    close FS;
    return @OP;
}
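For comparison, here is a sketch of the same two helpers in a more defensive style (assuming Perl 5.6 or later for lexical filehandles and three-argument open); the names read_file and read_lines are just chosen to keep them distinct from the originals:

```perl
use strict;
use warnings;

sub read_file {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    local $/;                # undef the record separator: slurp everything
    my $contents = <$fh>;
    close $fh or die "Can't close $path: $!";
    return $contents;
}

sub read_lines {
    my ($path) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    my @lines = <$fh>;       # list context reads all lines at once
    close $fh or die "Can't close $path: $!";
    return @lines;
}
```

The lexical $fh closes itself when it goes out of scope, and the three-argument open avoids surprises if the filename starts with a character like > or |.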
RE: Return the contents of a file
by btrott (Parson) on Feb 04, 2000 at 23:31 UTC
    Two things:

    1) that while loop should read:

    while (<FS>) { $OP .= $_ }
    Note the all-important ".=" instead of "=" (probably just a mistype on your part. :)

    2) your opens should always check for error messages, and you should get faster results slurping the whole file in as one string, like so:

    sub readfile {
        my $OP;
        open FS, $_[0] or die "Can't open $_[0]: $!";
        {
            local $/ = undef;
            $OP = <FS>;
        }
        close FS or die "Can't close $_[0]: $!";
        return $OP;
    }

      Here's another version:

      sub readfile {
          local $/ = undef unless wantarray;
          return open(FS, shift) ? <FS> : undef;
      }
      No 'my' variables, handles both string and array requests, and returns undef if the file could not be opened. Use it like this:

      defined($slurp = &readfile($file)) or die "Could not open $file: $!\n";
      print "Scalar 'slurp' now has ", length $slurp, " characters in it\n";
      defined(@slurp = &readfile($file)) or die "Could not open $file: $!\n";
      print "Array 'slurp' now has ", scalar @slurp, " elements in it.\n";
        Just love to see this.
      I REALLY do like this one. I've found that it works great for small files. Has anyone had any experience using this technique with something large, like a *.csv file from a database dump? Are there any memory issues in handling a variable that large? -- I shouldn't think so. Hmmm...
        There *are* memory issues in having very large variables, just as there would be, presumably, in any language. Perl variables take up a lot of space. From perlfaq3:
        When it comes to time-space tradeoffs, Perl nearly always prefers to throw memory at a problem. Scalars in Perl use more memory than strings in C, arrays take more than that, and hashes use even more.
        I mean, if you *can* process a file line by line, it's best to do so.
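        To illustrate the point, here's a sketch of line-by-line processing that never holds more than one record in memory (the sub name and the counting task are made up for illustration):

```perl
use strict;
use warnings;

# Count lines matching a pattern without slurping the whole file.
sub count_matching {
    my ($path, $pattern) = @_;
    open my $fh, '<', $path or die "Can't open $path: $!";
    my $count = 0;
    while (my $line = <$fh>) {        # reads one record at a time
        $count++ if $line =~ /$pattern/;
    }
    close $fh;
    return $count;
}
```

Memory use stays constant no matter how big the dump file grows, which is exactly why the line-at-a-time idiom is the default.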
RE: Return the contents of a file
by chromatic (Archbishop) on Apr 12, 2000 at 18:43 UTC
    You can also undefine the input record separator $/:
    local $/; my $scalar = <FH>;
    That obviates the need for a while loop. Normally, $/ is set to a newline. That's why the magic <FILEHANDLE> reads one line at a time. Using a local $/ will change that to whatever separator you want. Undefine it, and you slurp the whole thing. That could make your first example even more compact.
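    Because local restores $/ when the enclosing scope ends, you can confine the slurp to a sub (or a bare block) and go back to line-at-a-time reads afterwards. A minimal sketch, with the sub name chosen for illustration:

```perl
use strict;
use warnings;

sub slurp_handle {
    my ($fh) = @_;
    local $/;               # undef: the next read grabs everything left
    return scalar <$fh>;
}                           # $/ reverts to "\n" when the sub returns
```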