I think you've done a fine job. However, there are numerous things in
your code that could be simpler. For starters, I would leave the
choice of input and output files to the shell: take the file list
from @ARGV and redirect standard output.
Those are things the shell is better at.
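In Perl terms that just means reading whatever filenames land in @ARGV
and writing to STDOUT. A minimal sketch of that idea (just the shape of it,
not the column-pasting logic itself) looks like this:

    while (<>) {    # the diamond operator opens each file named in @ARGV in turn
        print;      # output goes wherever the shell redirects STDOUT
    }

Invoke it as, say, perl yourscript.pl a.txt b.txt > out.txt (names made up
for illustration) and the shell does all the file handling for you.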
I'd also be wary of any solution that reads all that data into memory,
even if in small chunks.
Here's one way to do it that happens to make use of a standard module,
Tie::File, the doco of which says:
"The file is not loaded into memory, so this will work even for gigantic files."
use strict;
use warnings;
use Tie::File;

# Tie each input file to an array; Tie::File fetches records on demand,
# so nothing is slurped into memory.
my @files_as_arrays = map {
    my @a;
    tie @a, 'Tie::File', $_ or die "Can't tie '$_': $!";
    \@a;
} @ARGV;

{
    local( $,, $\ ) = ( "\t", "\n" );    # tab between fields, newline after each record
    my $i = 0;
    while (1) {
        my @a = map { $_->[$i] } @files_as_arrays;
        $i++;
        grep { defined $_ } @a or last;  # stop once every file is exhausted
        no warnings;                     # shorter files contribute undef (empty) fields
        print @a;
    }
}
Of course, an array of filehandles would work. It's a little bit simpler, too:
use strict;
use warnings;
use IO::File;

my @filehandles = map {
    IO::File->new( $_, 'r' ) or die "Can't open '$_': $!";
} @ARGV;

while (1) {
    # Read the next line from every file; chomp so the tabs join cleanly.
    my @a = map { my $line = $_->getline; chomp $line if defined $line; $line } @filehandles;
    grep { defined $_ } @a or last;   # stop once every file is exhausted
    no warnings;                      # shorter files contribute undef (empty) fields
    print join( "\t", @a ), "\n";
}
We're building the house of the future together.