Thanks graff, this is a clever solution. I'm all up for looking at alternatives and am looking into this now. Likewise, I'm also tending to avoid the File::Find module. I'm having to rewrite my program now, as the use of File::Find was at the heart of it, and the sheer size of our file server has rendered that approach obsolete as a practical solution.
Thanks mate
Dean
Just a thought about something you might try... This works for me under unix, and I expect it would work on Windows as well. It's very good in terms of using minimal memory, and has fairly low system overhead overall:
chdir $toppath or die "can't cd to $toppath: $!";
open( FIND, "find . -type d |" ) or die "can't run find: $!";
while ( my $d = <FIND> ) {
    chomp $d;
    unless ( opendir( D, $d )) {
        warn "$toppath/$d: open failed: $!\n";
        next;
    }
    while ( my $f = readdir( D )) {
        next if ( -d "$d/$f" );  # outer while loop will handle all dirs
        # do what needs to be done with data files
    }
    closedir( D );
    # anything else we need to do while in this directory
}
close FIND;
This has the nice property that all the tricky recursion stuff is handled by "find", while all the logic-intensive, file-based stuff is handled pretty easily by Perl, working with just the data files in a single directory at any one time.
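If you'd rather not depend on an external "find" binary at all (say, for Windows portability), the same idea can be done in pure Perl with an explicit work list instead of recursion. This is just a sketch of that alternative, not graff's code: the `walk_dirs` name and the callback interface are my own invention, and it doesn't follow symlinks or handle loops.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Iterative traversal with an explicit queue: memory stays proportional
# to the number of pending directories, never the total file count.
sub walk_dirs {
    my ($toppath, $file_cb) = @_;
    my @queue = ($toppath);
    while ( my $d = shift @queue ) {
        unless ( opendir( my $dh, $d )) {
            warn "$d: open failed: $!\n";
            next;
        }
        while ( my $f = readdir( $dh )) {
            next if $f eq '.' or $f eq '..';
            my $path = "$d/$f";
            if ( -d $path ) {
                push @queue, $path;      # visit this directory later
            }
            else {
                $file_cb->($path);       # do what needs doing with data files
            }
        }
        closedir( $dh );
    }
}

# Example use: print every data file under /some/top
# walk_dirs( '/some/top', sub { print "$_[0]\n" } );
```

Same trade-off as the "find" pipe: only one directory's worth of entries is in memory at a time, so it should cope with a very large file server where File::Find's recursion becomes painful.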