I REALLY do like this one. I've found that it works great
for small files. Has anyone had any experience using this
technique w/ something large like a *.csv file from a database
dump? Are there any memory issues with holding a variable that
large? -- I shouldn't think so. Hmmm...
No 'my' variables, handles both string and
array requests, and returns undef if the file
could not be opened. Use it like this:
defined($slurp = &readfile($file)) or die "Could not open $file: $!\n";
print "Scalar 'slurp' now has ", length $slurp, " characters in it.\n";
(@slurp = &readfile($file)) or die "Could not open $file: $!\n";
print "Array 'slurp' now has ", scalar @slurp, " elements in it.\n";
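For reference, here's one way such a readfile could look. This is just a sketch matching the description above (no 'my' variables, scalar context slurps, list context returns lines, a failed open returns undef), not necessarily the original poster's code:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A sketch of a readfile along the lines described: no 'my' variables,
# scalar context slurps the whole file, list context returns the lines,
# and a failed open returns undef (an empty list in list context).
sub readfile {
    local *FH;                      # localized bareword handle instead of 'my'
    open FH, '<', $_[0] or return;  # bare return: undef / empty list on failure
    return <FH> if wantarray;       # list context: one line per element
    local $/;                       # scalar context: undef $/ means slurp
    return scalar <FH>;
}
```

Note that in list context a failed open and an empty file both come back as an empty list, which is why the array example above tests for truth rather than definedness.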
That obviates the need for a while loop. Normally, $/ is set to a newline. That's why the magic <FILEHANDLE> reads one line at a time. Using a local $/ will change that to whatever separator you want. Specify none, slurp the whole thing. That could make your first example even more compact.
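The $/ trick in miniature: localizing $/ to undef makes the diamond return the whole file in one read instead of line by line. (This example slurps the running script itself via $0, just so it has a file that's guaranteed to exist.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Slurp an entire file in one go by localizing $/ to undef.
my $contents = do {
    local $/;                                  # undef record separator: no line boundaries
    open my $fh, '<', $0                       # $0 = this script, purely for illustration
        or die "Could not open $0: $!\n";
    <$fh>;                                     # one read returns the entire file
};

print "Slurped ", length $contents, " characters.\n";
```

Because the local is inside the do block, $/ snaps back to "\n" as soon as the block exits, so later <FILEHANDLE> reads behave normally.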