http://www.perlmonks.org?node_id=1017136


in reply to Using piped I/O?

It is almost always a mistake to implement directory searching on your own. Great directory-searching tools already exist, e.g. ack and find, and a home-grown one will be buggy, feature-poor and barely usable.

    ack -ag '\.pep$' | xargs your-program
    find . -iname '*.pep' | xargs your-program

Separate your concerns. Just write your program to deal with file names as command-line arguments (see @ARGV) and leave the searching to someone else.

You do not need pipes (in your program). You do not need arrays. Since you say the files are pretty large, go over them line by line to save memory. Slurping a whole file into an array would be harmful because it occupies roughly as much memory as the file is large.

    use 5.010;
    use autodie qw(:all);

    for my $file (@ARGV) {
        say "Now processing file '$file'";
        open my $handle, '<:raw', $file;
        while (my $line = readline $handle) {
            # do something with the line here.
        }
        close $handle;
    }

Likely you want to include Getopt::Long to process additional command-line options, and Pod::Usage (see the chapter "Recommended Use") to write some nice documentation.
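
A minimal sketch of how that might look, loosely following the pattern recommended in the Pod::Usage documentation; the --verbose option is only a hypothetical placeholder for whatever options your program actually needs:

    use strict;
    use warnings;
    use Getopt::Long;
    use Pod::Usage;

    # Hypothetical option, just to illustrate GetOptions; replace with your own.
    my $verbose = 0;
    GetOptions(
        'verbose!' => \$verbose,
        'help|?'   => sub { pod2usage(1) },
        'man'      => sub { pod2usage(-exitval => 0, -verbose => 2) },
    ) or pod2usage(2);

    pod2usage("$0: no input files given.") unless @ARGV;

    # ... process @ARGV line by line as shown above ...

    __END__

    =head1 NAME

    your-program - process the files named on the command line

    =head1 SYNOPSIS

    your-program [--verbose] file ...

    =cut

GetOptions removes the options it recognizes from @ARGV, so the remaining elements are exactly the file names your processing loop expects.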