in reply to Open multiple file handles?
As suggested in the preceding replies, you've grabbed the wrong end of the stick and run with it. A better solution (for modest-sized data sets, anyway) than the one you are struggling to implement is to read the files one at a time and merge the results in memory. Consider:
#!/usr/bin/perl
use warnings;
use strict;

# First set up the sample files
my @fileContents = ('a 1 b 2 c 3', 'a 5 d 3', 'a 1 x 4');

@ARGV = ();
for (my $fileNum = 1; @fileContents; ++$fileNum) {
    my $fileName = "file$fileNum.txt";
    open my $fileOut, '>', $fileName or die "Failed to create $fileName: $!\n";
    push @ARGV, $fileName;
    print {$fileOut} shift @fileContents;
}

# Now for the "real" code
my %data;
my $maxFile = @ARGV - 1;

while (<>) {
    my %newData = split;
    # <> shifts each file name off @ARGV as it opens the file, so
    # $maxFile - @ARGV is the column index for the current file
    $data{$_}[$maxFile - @ARGV] = $newData{$_} for keys %newData;
}

for my $key (sort keys %data) {
    $data{$key}[$maxFile] ||= 0;    # extend the row to the last column
    $_ ||= 0 for @{$data{$key}};    # fill any holes with 0
    print "$key @{$data{$key}}\n";
}
Prints:
a 1 5 1
b 2 0 0
c 3 0 0
d 0 3 0
x 0 0 4
Note that most of the "tricky" code deals with getting the output into the required format, accounting for "missing" elements.
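The padding idiom can be seen in isolation (a minimal sketch, not part of the original node; the sample values are invented for illustration): `||=` on an index past the end of the array extends it, and the aliasing `for` then turns every hole (undef) into a literal 0.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # A sparse row with holes at indices 1 and 3
    my @row = (7, undef, 2);
    $row[3] ||= 0;          # autovivifies index 3, extending the array
    $_ ||= 0 for @row;      # $_ aliases each element, so undefs become 0
    print "@row\n";         # prints "7 0 2 0"

Note that `||=` also clobbers a legitimate 0 or empty string with 0; that is harmless here, but `//=` (defined-or, Perl 5.10+) expresses the intent more precisely.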
True laziness is hard work
In Section Seekers of Perl Wisdom