in reply to File::Copy::Recursive - current file/directory being processed

Even getting it to print the dots, as in my earlier post, was an abuse of the module. The trick worked only because it turns out that dircopy uses fcopy internally, so even though dircopy doesn't expose its recursive method, overriding fcopy was a way to hook into the module's mechanics.
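The mechanics of that fcopy override can be sketched with core Perl alone. Toy::Recursive below is a hypothetical stand-in for File::Copy::Recursive (so the sketch runs without any CPAN modules); what matters is that, like the real module, its dircopy delegates per-file work to fcopy, so swapping fcopy's symbol-table entry hooks every file:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Toy stand-in for File::Copy::Recursive (hypothetical package, purely
# for illustration): like the real module, dircopy calls fcopy per file.
package Toy::Recursive;
sub fcopy   { my ($src, $dst) = @_; return 1 }   # pretend to copy one file
sub dircopy {
    my @files = @_;
    Toy::Recursive::fcopy($_, 'dest') for @files;
    return scalar @files;
}

package main;

# The "dot" trick: keep a reference to the original fcopy, then install
# a wrapper in its symbol-table slot that prints a dot before delegating.
our $dots = 0;
my $orig_fcopy = \&Toy::Recursive::fcopy;
{
    no warnings 'redefine';
    *Toy::Recursive::fcopy = sub {
        print '.';
        $dots++;
        return $orig_fcopy->(@_);
    };
}

# dircopy never sees the swap; its internal fcopy calls hit the wrapper.
my $n = Toy::Recursive::dircopy(qw(a.txt b.txt c.txt));
print "\ncopied $n files\n";   # prints three dots, then "copied 3 files"
```

The same swap works against the real File::Copy::Recursive::fcopy because Perl resolves fully-qualified sub calls through the symbol table at call time.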

Now you're trying to bolt even more functionality onto a module that really wasn't designed with extensibility in mind. Your best hope is to study the module's source code to discover whether there is any other point in its operation where you could hook in using the sub-wrap trick again. If you get lucky, you'll find that dircopy calls some function with enough information for you to grab the current filename. As a matter of fact, it probably is possible by adding a "before" sub with that Hook::WrapSub technique and examining the contents of @_. You should do this yourself, because it will be a good learning exercise, even if the result introduces some spooky action at a distance into your dircopy calls. We shouldn't deprive you of the opportunity to study the source code, learn from it, and come up with a solution.

If you recall from an earlier post, I used Hook::WrapSub. That may still be useful if you are using File::Copy::Recursive (I'm assuming you've abandoned File::Find by now, though it's getting hard to follow what you're doing). Study the source code for File::Copy::Recursive, figure out where you can hook in with Hook::WrapSub, create a 'before' or 'after' sub (probably 'before' in this case), and use Data::Dumper to dump the contents of @_ so you can see what gets passed into the sub you are wrapping. That will tell you whether a source filename or directory name is available.
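Those steps can be sketched as follows. This is untested against any particular File::Copy::Recursive version, needs both CPAN modules installed, and the temp-dir setup is just scaffolding so the sketch is self-contained and runnable:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use File::Temp qw(tempdir);
use Hook::WrapSub qw(wrap_subs);          # CPAN
use File::Copy::Recursive qw(dircopy);    # CPAN

# Scaffolding: a throwaway source tree with one file in it.
my $src = tempdir(CLEANUP => 1);
my $dst = tempdir(CLEANUP => 1);
open my $fh, '>', "$src/example.txt" or die $!;
print $fh "hello\n";
close $fh;

# The 'before' sub: Hook::WrapSub calls it ahead of every wrapped call,
# with @_ holding the arguments the wrapped sub is about to receive.
# Dump them to see whether a source filename is in there.
sub before_fcopy {
    local $Data::Dumper::Terse = 1;
    print STDERR 'fcopy will receive: ', Dumper(\@_);
}

wrap_subs \&before_fcopy, 'File::Copy::Recursive::fcopy';

# dircopy calls fcopy internally, so the hook fires once per file.
dircopy($src, $dst) or die "dircopy failed: $!";
```

Once the dump shows where the source path sits in @_, the 'before' sub can pull it out and print it instead of dumping everything.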

I do want to mention again that outputting dots was your original request, and I showed how to make that happen. It's a bit of an ugly hack, but it worked. There's nothing about it that I would consider a best practice; I wouldn't want to find it in production code, for example.


Replies are listed 'Best First'.
Re^2: File::Copy::Recursive - current file/directory being processed
by bigal_george (Acolyte) on Jan 01, 2020 at 04:41 UTC
    Hi Dave. I will post what I've done so far; it might be useful. Certainly the tests I am having are showing some really good speed advantages in Linux anyways. As with most of my software in the past, this is done for love, not for monetary gain, and the only thing I would say is for any future user to take absolute care when using it, as I've been programming in Perl for only 3 days and there are bound to be some errors. Davo, your info has been extremely helpful and your pointers taken. I had just been checking the recursive module: it calls File::Copy::copy, and I tried to hook into that, which was a waste of time. Yours works, but it seems to be far too re-entrant, so I get about 10x the true number of files and directories reported in dircopy(); it is more accurate for pathrm(). The code below is still under testing, so user beware; there are some extra print statements and tests, so it's not ready for users. Any tips appreciated.
    #!/usr/bin/perl -w
    # Fastcmd - fast copy/move/delete of whole directory trees, like cp/mv/rm -r
    # moded by A George; use at your own risk!
    # EXAMPLE call under Linux or Windows:
    #   perl /home/alistair/perl/fastcmd -d .   (deletes everything in the current directory)
    # tested under Linux: deleted 30,000 files in 4 seconds on an ext4 hdd partition
    # ui error (not bug): couldn't rmdir directory .: Invalid argument at /home/alistair/perl/ line 17.

    use strict;
    use warnings FATAL => "all";
    use 5.010;
    use File::Find;
    use File::Path qw(make_path);
    use File::Copy::Recursive qw(fcopy rcopy dircopy fmove rmove dirmove);
    use File::Basename;
    use File::Spec::Functions;
    use Hook::WrapSub qw(wrap_subs);
    use IO::Handle;
    use Time::HiRes qw(gettimeofday);

    $SIG{INT} = \&interrupt;    # trap Ctrl-C

    my ($function, $sourcedir, $targetdir) = @ARGV;
    die ": missing operand. Usage: $0 <OPTION>... <SOURCE_DIRECTORY>... [TARGET_DIRECTORY]...
    OPTION applies to all the FILE(s) and DIRECTORIES:
      -c = copy or -co = copy with overwrite
      -m = move or -mo = move with overwrite
      -d = delete
    " unless (defined $function && defined $sourcedir);

    # check input validity
    opendir(my $source_handle, $sourcedir) or die "Can't opendir $sourcedir: $!\n";
    if (!defined($targetdir)) { $targetdir = '.' }

    my $overwrite = (length $function > 2) && $function =~ m/..o/;
    $overwrite && chop($function);    # strip the trailing 'o'; don't need it now
    #print STDERR "$overwrite $function\n";

    #$File::Copy::Recursive::CPRFComp = 1;
    $File::Copy::Recursive::SkipFlop = 1;    # use below later for logfile
    #$File::Copy::Recursive::RMTrgDir = $overwrite;
    #$File::Copy::Recursive::RMTrgFil = $overwrite;

    # hook information subroutine
    my $t0 = gettimeofday();
    my $FDcount = 0;
    my $lastitem = "";
    my $num_of_files_and_dirs = 0;

    sub after_dircopy {
        #if ($lastitem eq $!)
        #    { print STDERR "\n$lastitem"; return; }
        #$lastitem = $!;
        my $t1 = gettimeofday();
        printf("\rElapsed time since start: H%02d:M%02d:S%02d  Approx number of files and directories processed: %02d  Ctrl-C breaks",
            ($t1 - $t0) / (60 * 60), ($t1 - $t0) / 60 % 60, ($t1 - $t0) % 60,
            ++$FDcount);
        STDOUT->flush;
    }
    wrap_subs sub {}, 'File::Copy::Recursive::fcopy',
                      'File::Copy::Recursive::pathrm',
                      'File::Copy::Recursive::pathempty', \&after_dircopy;

    # if the user enters './' we replicate the source directory; '.' or ''
    # leave the destination as the current directory
    if ($targetdir =~ /^[.\/]{2}$/) { $targetdir = './' . basename($sourcedir) }

    #if ($function ne "-d") {
    #    print "\n$overwrite $function, $sourcedir, $targetdir\nConfirm you wish to proceed y/n?";
    #    chomp(my $input = <STDIN>);
    #    if ($input !~ /^[Y]?$/i) { exit() }
    #}

    if ($function eq "-d") {    # DELETE EVERYTHING
        print "This script is DANGEROUS - Are you sure you want to delete EVERYTHING in directory ", $sourcedir, " (y/n): ";
        chomp(my $input = <STDIN>);
        if ($input !~ /^[Y]?$/i) { exit() }
        File::Copy::Recursive::pathrm("$sourcedir/", "1") or print "\nProcess completed\n";
        exit;
    }
    elsif ($function eq "-m") {    # MOVE ROUTINE
        #exit;    # left over from testing; uncommented, this would skip the move entirely
        length $targetdir > 0 or die "No target directory given $targetdir/: $!\n";
        make_path("$targetdir/") or die "Cannot create $targetdir/: $!\n";
        $num_of_files_and_dirs = dirmove("$sourcedir/", "$targetdir/")
            or die "Cannot move $sourcedir/: $!\n";
        print "\nFiles & Dirs moved: $num_of_files_and_dirs\n";
        exit;
    }
    else {    # COPY ROUTINE
        length $targetdir > 0 or die "No target directory given $targetdir/: $!\n";
        $num_of_files_and_dirs = dircopy("$sourcedir/", "$targetdir/")
            or die "\nCannot copy $!";
        print "\nFiles & Dirs copied: $num_of_files_and_dirs\n";
        exit;
    }

    sub interrupt {
        die "\nProcess interrupted by Ctrl-C\nNumber of files and directories processed: $FDcount\n";
    }
    # ENDS
      Certainly the tests I am having are showing some really good speed advantages in Linux anyways.

      Speed advantages compared to what? While implementing these commands in Perl can be a good exercise, I would be very surprised if a Perl program were faster than the system rm, mv, and cp commands, all of which are implemented in C on GNU systems, and most Linux distributions use the GNU coreutils package. All of those commands accept the -v option to list the files on which they act, and that output is easily turned into "running dots" with Awk: (untested)

      cp -v "$FROM" "$TO" | awk '{printf "."} END {print ""}'

      I am not entirely convinced that Perl is the right tool for this job, unless, again, you are doing this as a programming exercise.
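      If you do stay in Perl, the same idea works by driving the system cp from Perl and turning its -v output into running dots, one dot per line cp prints. A sketch, assuming a GNU-style cp on the PATH; the temp dirs and file names are just scaffolding so it runs standalone:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Scaffolding: a throwaway source tree with three files.
my $src = tempdir(CLEANUP => 1);
my $dst = tempdir(CLEANUP => 1);
for my $name (qw(a.txt b.txt c.txt)) {
    open my $fh, '>', "$src/$name" or die $!;
    print $fh "demo\n";
    close $fh;
}

$| = 1;    # unbuffer STDOUT so the dots appear as they happen
my $dots = 0;
open my $cp, '-|', 'cp', '-rv', $src, $dst or die "cannot run cp: $!";
while (my $line = <$cp>) {    # cp -v prints one line per item copied
    print '.';
    $dots++;
}
close $cp or die "cp failed: $?";
print "\n";
```

This gets the C-speed copy and the progress dots without hooking into any module's internals.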