Esteemed monks,
I am trying to create code which will recursively sync files between two directories which are given on the command line. I would like the code to look through each directory, tell me which files are common and which are not, then sync (copy to whichever directory is file deficient) any of the files seen only once. In my novice and modest efforts, I have already come up with:
#!/usr/bin/perl -w
# use pragmas
#####
# use diagnostics;
use strict;
# use modules
#####
use File::Find;
use File::Basename;
# Declare variables
#####
my $file;
my %count;
my @total_files;
# Make sure command line argument is supplied
#####
die "No command line arguments: $!\n" unless @ARGV;
# wanted subroutine for File::Find
#####
sub wanted {
    $file = $File::Find::name;
    return if basename($file) =~ /^\.$|^\.\.$|^\.DS/; # skip . and .. and .DS_* for Mac OS
    push(@total_files, basename($file));
}
# process @ARGV
#####
find(\&wanted, @ARGV);
# create hash for basename file seen count
#####
foreach $file (@total_files) {
    $count{$file} += 1;
}
# print to STDOUT any file not seen 2 or more times
#####
foreach $file (keys %count) {
    if ($count{$file} < 2) {
        print "SYNC this file: $file\n";
    }
}
# for STDOUT format cleanliness
#####
print "\n";
The catch is that $file here is only the basename. When I use the full path returned by File::Find, it doesn't meet my hash requirements, because each path is different (as I am sure you already know).
I am not sure how to identify by path each file that needs to go from the directory in which it exists to the directory in which it does not exist. Could you please steer me in a direction which would cause the code to "smartly" identify and sync each file which does not already exist in each location simultaneously?
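In case it makes the question more concrete, here is roughly the shape I imagine the finished thing taking, assuming exactly two directories on the command line, that matching basenames mean the same file (no content comparison), and that copying into the top of the other directory, rather than recreating subdirectories, is acceptable. File::Copy is core; the %in_dir layout and the names are my own untested guesswork:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use File::Copy;

die "Usage: $0 dir1 dir2\n" unless @ARGV == 2;
my ( $dir1, $dir2 ) = @ARGV;

# Map each basename to the full path where it was found, per directory.
my %in_dir;    # directory => { basename => full path }
for my $dir ( $dir1, $dir2 ) {
    find(
        sub {
            return unless -f $_;    # plain files only
            return if /^\.DS/;      # skip Mac OS .DS_* files
            $in_dir{$dir}{$_} = $File::Find::name;
        },
        $dir
    );
}

# Copy anything present under one directory but not the other.
for my $pair ( [ $dir1, $dir2 ], [ $dir2, $dir1 ] ) {
    my ( $from, $to ) = @$pair;
    for my $base ( keys %{ $in_dir{$from} } ) {
        next if exists $in_dir{$to}{$base};
        print "SYNC this file: $base ($from -> $to)\n";
        copy( $in_dir{$from}{$base}, "$to/$base" )
            or warn "copy failed for $base: $!\n";
    }
}

I realise this also assumes basenames are unique within each tree; if the same basename appears twice under one directory, the hash would clobber itself.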
Thank you for any aid.