in reply to A very odd happening (at least... to me)
in thread Processing large files many times over
From a quick glance at the code, one of the first questions that comes to mind is "how many files are in these directories?" I suspect that part of your slowness comes from reading both the entire list of files and the entire contents of each file into memory. If you alter your reading structure like so:

    opendir(DIR, "$base_dir\\$dir") or die "$dir failed to open: $!";
    while (my $file = readdir(DIR)) {
        next unless $file =~ /\.txt$/;
        # etc, etc.
        my $full_name = "$base_dir\\$dir\\$file";
        open(IN, $full_name) or die "can't open $full_name: $!";
        while (my $line = <IN>) {
            # processing
        }
        close(IN);
    }
    closedir(DIR);

you won't have the overhead of all that memory allocation.

In your second example there's a system call to a secondary Perl script. That's going to be time-consuming too. Consider making the second Perl program a subroutine instead... that will avoid a fork, exec, and compile for every file you have.
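For instance, a minimal sketch of that change (the name process_file and its body are assumptions on my part, since your second script isn't shown):

    # Hypothetical: the second script's per-file logic folded into a sub,
    # compiled once instead of fork/exec'ing a fresh perl for every file.
    sub process_file {
        my ($full_name) = @_;
        open(my $fh, '<', $full_name) or die "can't open $full_name: $!";
        while (my $line = <$fh>) {
            # ... whatever the second script does with each line ...
        }
        close($fh);
    }

Then call process_file($full_name) from inside the readdir loop above, where the second example currently does something like system("perl second_script.pl $full_name").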
HTH
/\/\averick
OmG! They killed tilly! You *bleep*!!
In Section: Seekers of Perl Wisdom