
Copy folders

by avik1612
on Aug 28, 2007 at 06:22 UTC
avik1612 has asked for the wisdom of the Perl Monks concerning the following question:

How do I copy a folder from one location to another? I tried using File::Copy, but it is not working. Thanks in advance. Avi

Replies are listed 'Best First'.
Re: Copy folders
by Samy_rio (Vicar) on Aug 28, 2007 at 07:21 UTC

    Hi avik1612, copy the folder using File::Copy::Recursive

    use strict;
    use warnings;
    use File::Copy::Recursive qw(dircopy);

    dircopy('E:\test\split\Output', 'C:\New') or die("$!\n");

    Velusamy R.

    eval"print uc\"\\c$_\""for split'','j)@,/6%@0%2,`e@3!-9v2)/@|6%,53!-9@2~j';

Re: Copy folders
by andreas1234567 (Vicar) on Aug 28, 2007 at 07:14 UTC
    File::Copy's copy function copies files, not directories. You can use File::Find to find the files and use those as input to File::Copy.
    use strict;
    use warnings;
    use File::Copy;
    use File::Find;

    my @fromdirs = qw(/tmp/foo);
    my $todir    = q{/tmp/bar};

    find(
        sub {
            if (-f) {
                print "$File::Find::name -> $todir";
                copy($File::Find::name, $todir)
                    or die(q{copy failed:} . $!);
            }
        },
        @fromdirs
    );
    __END__
    $ mkdir -p /tmp/foo
    $ mkdir -p /tmp/bar
    $ touch /tmp/foo/1
    $ touch /tmp/foo/2
    $ perl -l
    /tmp/foo/1 -> /tmp/bar
    /tmp/foo/2 -> /tmp/bar
Re: Copy folders
by moritz (Cardinal) on Aug 28, 2007 at 06:59 UTC
Re: Copy folders
by SM177Y (Initiate) on Sep 13, 2015 at 12:57 UTC

    Here is a pure Perl method I made up after endless searching and having people tell me to use modules. This is such a simple concept that a required external module seems pretty lame. I have also written a recursive directory deleter, so you can eliminate the need for File::Path qw(remove_tree) and File::Copy::Recursive qw(rcopy) altogether. Used together, these let you easily perform a directory move. Finally, I included an fmove() (you could use rename in certain cases, obviously; fmove is identical to fcopy except for a final unlink), but if you combined fmove() into dircopy you could make a single sub/function to move a folder in one shot instead of a recursive copy followed by a recursive delete.

    #!/usr/bin/env perl
    use strict;
    use warnings;

    dircopy($ARGV[0], $ARGV[1]);

    sub dircopy {
        my @dirlist = ($_[0]);
        my @dircopy = ($_[1]);
        until (scalar(@dirlist) == 0) {
            mkdir "$dircopy[0]";
            opendir my ($dh), $dirlist[0];
            my @filelist = grep { !/^\.\.?$/ } readdir $dh;
            for my $i (0 .. scalar(@filelist) - 1) {
                if (-f "$dirlist[0]/$filelist[$i]") {
                    fcopy("$dirlist[0]/$filelist[$i]", "$dircopy[0]/$filelist[$i]");
                }
                if (-d "$dirlist[0]/$filelist[$i]") {
                    push @dirlist, "$dirlist[0]/$filelist[$i]";
                    push @dircopy, "$dircopy[0]/$filelist[$i]";
                }
            }
            closedir $dh;
            shift @dirlist;
            shift @dircopy;
        }
    }

    sub fcopy {
        my ($i, $data, $cpo, $cpn);
        open($cpo, "<", $_[0]) or die $!;
        binmode($cpo);
        open($cpn, ">", $_[1]) or die $!;
        binmode($cpn);
        while (($i = sysread $cpo, $data, 4096) != 0) { print $cpn $data }
        close($cpn);
        close($cpo);
    }

    And here is the recursive directory deleter...

    #!/usr/bin/env perl
    use strict;
    use warnings;

    dir_del($ARGV[0]);

    sub dir_del {
        my @dirlist = ("$_[0]");
        my $r = 0;
        my $s = scalar(@dirlist);
        while ($r < $s) {
            opendir my ($dh), $dirlist[$r];
            my @filelist = grep { !/^\.\.?$/ } readdir $dh;
            for my $i (0 .. scalar(@filelist) - 1) {
                if (-f "$dirlist[$r]/$filelist[$i]") {
                    unlink("$dirlist[$r]/$filelist[$i]");
                }
                else {
                    push @dirlist, "$dirlist[$r]/$filelist[$i]";
                    $s = scalar(@dirlist);
                }
            }
            closedir $dh;
            $r += 1;
        }
        my $a = scalar(@dirlist) - 1;
        for my $i (0 .. scalar(@dirlist) - 1) {
            rmdir($dirlist[$a]);
            $a -= 1;
        }
    }

    Just a side note: when using these, make sure you run -d tests if you are not certain whether your target is a folder or a file. If it's just a file, you can } else { unlink($file); }

    sub fmove {
        my ($i, $data, $mvo, $mvn);
        open($mvo, "<", $_[0]) or die $!;
        binmode($mvo);
        open($mvn, ">", $_[1]) or die $!;
        binmode($mvn);
        while (($i = sysread $mvo, $data, 4096) != 0) { print $mvn $data }
        close($mvn);
        close($mvo);
        unlink($_[0]);
    }
      #!/usr/bin/env perl
      use strict;
      use warnings;

      dircopy("$ARGV[0]", "$ARGV[1]");

      sub dircopy {
          my @dirlist = ("$_[0]");
          my @dircopy = ("$_[1]");
          until (scalar(@dirlist) == 0) {
              mkdir "$dircopy[0]";
              opendir my ($dh), "$dirlist[0]";
              my @filelist = grep { !/^\.\.?$/ } readdir $dh;
              for my $i (0 .. scalar(@filelist) - 1) {
                  if (-f "$dirlist[0]/$filelist[$i]") {
                      fcopy("$dirlist[0]/$filelist[$i]", "$dircopy[0]/$filelist[$i]");
                  }
                  if (-d "$dirlist[0]/$filelist[$i]") {
                      push @dirlist, "$dirlist[0]/$filelist[$i]";
                      push @dircopy, "$dircopy[0]/$filelist[$i]";
                  }
              }
              closedir $dh;
              shift @dirlist;
              shift @dircopy;
          }
      }

      sub fcopy {
          my ($i, $data, $cpo, $cpn);
          open($cpo, "<", "$_[0]") or die $!;
          binmode($cpo);
          open($cpn, ">", "$_[1]") or die $!;
          binmode($cpn);
          while (($i = sysread $cpo, $data, 4096) != 0) { print $cpn $data }
          close($cpn);
          close($cpo);
      }

      #!/usr/bin/env perl
      use strict;
      use warnings;

      dir_del("$ARGV[0]");

      sub dir_del {
          my @dirlist = ("$_[0]");
          until (scalar(@dirlist) == 0) {
              opendir my ($dh), "$dirlist[0]";
              my @filelist = grep { !/^\.\.?$/ } readdir $dh;
              for my $i (0 .. scalar(@filelist) - 1) {
                  if (-f "$dirlist[0]/$filelist[$i]") {
                      unlink("$dirlist[0]/$filelist[$i]");
                  }
                  if (-d "$dirlist[0]/$filelist[$i]") {
                      push @dirlist, "$dirlist[0]/$filelist[$i]";
                  }
              }
              closedir $dh;
              rmdir "$dirlist[0]";
              shift @dirlist;
          }
          rmdir "$_[0]";
      }

      sub fmove {
          my ($i, $data, $mvo, $mvn);
          open($mvo, "<", "$_[0]") or die $!;
          binmode($mvo);
          open($mvn, ">", "$_[1]") or die $!;
          binmode($mvn);
          while (($i = sysread $mvo, $data, 4096) != 0) { print $mvn $data }
          close($mvn);
          close($mvo);
          unlink("$_[0]");
      }

      Sorry, this is junk. Here is why:

      Reinventing the wheel, poorly:
      • File::Copy implements copying of files. It has been in core (part of the perl distribution) since perl 5.002, about 20 years ago.
      • File::Path implements removing directory trees; it has been in core since perl 5.001.
      • File::Copy::Recursive implements copying directory trees since at least 2008. It is not a core module, but it is a pure perl module. Everybody can use it, even without a working C compiler, just by copying it to some directory where perl can find it.
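      Even without File::Copy::Recursive, the two jobs fit in a short core-only sketch (File::Find + File::Copy + File::Path, nothing to install). This is an illustration, not anything from the thread's code; the /tmp paths are placeholders:

      #!/usr/bin/env perl
      use strict;
      use warnings;
      use File::Find;
      use File::Copy qw(copy);
      use File::Path qw(make_path remove_tree);

      my ($src, $dst) = ('/tmp/pm_demo_src', '/tmp/pm_demo_dst');

      # Build a tiny source tree for the demonstration.
      make_path("$src/sub");
      open my $fh, '>', "$src/sub/a.txt" or die "open $src/sub/a.txt: $!";
      print {$fh} "hello\n";
      close $fh or die "close: $!";

      # Copy: recreate each directory, copy each plain file.
      find(sub {
          (my $rel = $File::Find::name) =~ s/^\Q$src\E//;
          if    (-d $File::Find::name) { make_path("$dst$rel") }
          elsif (-f _)                 { copy($File::Find::name, "$dst$rel")
                                             or die "copy $File::Find::name: $!" }
      }, $src);

      # Delete the source tree, collecting errors instead of ignoring them.
      remove_tree($src, { error => \my $err });
      die "remove_tree: ", scalar(@$err), " error(s)\n" if @$err;

      This sketch still shares some of the limitations listed below (symlinks, special files, permissions); the point is only how little code the core modules require.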
      Misbehaving code:
      • Missing error checks on mkdir, opendir, closedir, readdir, sysread, print, close, unlink, rmdir. All of these functions can fail, and any program should check if they failed.
      • Missing symlink handling. -f and -d use stat, not lstat, so symlinks to files and symlinks to directories are treated like regular files and directories. This (a) breaks any symlink "copied" by your code and (b) makes your code run into an infinite loop as soon as a directory contains a symlink to a directory above the current directory.
      • Missing special file handling. Your code completely ignores device files, pipes, sockets, and FIFOs.
      • Missing permission handling. Your code completely ignores file permissions. All files and directories "copied" by your code will have default permissions. With the usual umask of 022, this means all executables will lose their execute permissions, and, to make things worse, the copies of all private files that had their group and world read permissions removed will be group and world readable.
      • No attempts to make sure files aren't group and world readable while copying.
      • Repeated (implicit) stat calls in the -f and -d tests. Better call lstat once, then use the special filehandle _ as argument for -d, -f, and the other required tests.
      • sub fmove does not even attempt to use rename to move the file, which usually works as long as source and destination reside on the same filesystem, but stupidly copies the file content around, breaks permissions, timestamps, and ignores almost all possible errors.
      • The "recursive directory deleter" won't work reliably, as it attempts to rmdir directories at the end of the until loop before the next round of the loop deletes their subdirectories. Of course you could not see that, because you did not check for errors from rmdir, and I guess you did not try to remove a whole directory tree.
      • To make things worse, the final rmdir "$_[0]"; hides this problem for the case of a directory with only one level of subdirectories.
      • And once again, the "recursive directory deleter" does not handle symlinks properly: Symlinks to directories elsewhere will be followed, so random directories outside the directory passed to sub dir_del will be deleted. The symlink itself will stay, because rmdir can't remove symlinks. Symlinks to special files are ignored, they won't be deleted. Symlinks to plain files are accidentally handled properly, they will be deleted due to the -f test.
      • Talking of special files, everything not a plain file, directory, or symlink to a plain file won't be removed. It can't remove subdirectories that contain device files, pipes, sockets, or FIFOs. Symlinks to files are handled "accidentally", because -f treats them as files, and unlink can handle them.
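      The lstat-once idea from the points above can be sketched in a few lines; classify is a made-up helper name for illustration, not anything from the thread's code:

      use strict;
      use warnings;

      # One lstat per entry; every subsequent file test reuses the special
      # filehandle _ instead of hitting the filesystem again.
      sub classify {
          my ($path) = @_;
          lstat $path or return "lstat failed: $!";
          return 'symlink'    if -l _;   # must come before -f/-d, which follow links
          return 'plain file' if -f _;
          return 'directory'  if -d _;
          return 'special (device, pipe, socket, ...)';
      }

      A copier built on this would push 'directory' entries onto its work list, copy 'plain file' entries, and recreate 'symlink' entries with readlink/symlink instead of following them.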
      Ugly code:
      • Inconsistent indenting. If your editor is too stupid and you are too lazy for proper indenting, use an indenting program to make your code readable.
      • More than one command per line. Why?
      • Useless quoted variables almost everywhere: dircopy("$ARGV[0]","$ARGV[1]");, my @dirlist=("$_[0]");, my @dircopy=("$_[1]");, mkdir "$dircopy[0]";, ... Just omit the quotes.
      • Two identical arrays where one is sufficient in sub dircopy (@dirlist and @dircopy)
      • Iterating over array indices instead of array elements in sub dircopy: for my $i (0..scalar(@filelist)-1) is better written as for my $name (@filelist), using $name instead of $filelist[$i] inside the loop.
      • Reading the entire directory list into RAM may be problematic with huge directories and low memory conditions. while (my $fn=readdir $dh) reads one filename at a time.
      • Missing filename in error messages of open.
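      The last two points combined (one filename at a time, with the failing path in every error message) might look like this sketch; each_entry is a hypothetical helper, not from the thread's code:

      use strict;
      use warnings;

      # Stream a directory one entry at a time instead of slurping readdir
      # into an array, and report failures with the path that caused them.
      sub each_entry {
          my ($dir, $callback) = @_;
          opendir my $dh, $dir or die "opendir $dir: $!";
          while (defined(my $name = readdir $dh)) {
              next if $name eq '.' or $name eq '..';
              $callback->("$dir/$name");
          }
          closedir $dh or die "closedir $dir: $!";
      }

      Usage: each_entry('/some/dir', sub { print "$_[0]\n" });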


      Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
        Ha. A simple "I don't like what you did" would be fine. It was mostly a test exercise to see if I could do it, to be honest. And it does work properly despite the lack of some error checking, which is why I sort of hinted at it needing some. As far as indenting goes, I prefer tabs while others use spaces for portability; it keeps the code smaller. And I happen to like combining similar short commands on a single line. Just my preference. Isn't that what Perl is sort of about: freedom in coding style? Also, the arrays aren't identical, because the new folder tree has a different name. That's just how I did it. The only real reason I even did this was to eliminate the requirement for unnecessary modules in a specific piece of code I was working on. In my code I know there won't be any symlinks in its use, which is why I didn't bother. All in all, the point is that in my experience I've had modules not build on certain systems at all, or behave differently between 32- and 64-bit or between Perl versions, and cpan can be a PITA sometimes, so the fewer I can require in my code, the better. This was obviously a "crude" example. I appreciate your input. It just didn't need to be so... jump-the-gun angry :P. Peace.
      … having people tell me to use modules. … a required external module seems pretty lame.
      I think these words were what irked afoken (and possibly other monks, too). A great deal of Perl's power comes from its already available modules. The "extern"-ness of File::Copy and File::Path is such that they are just a use statement away - no installation needed.

      Nothing against writing this anew from scratch -- as a learning experience, to explore new ways, or for special needs. But the possibility to use modules (and to create them, for your own custom logic that you need in multiple places) is one of Perl's strengths.

        I understand completely. I see now that my statement could have been interpreted a little more harshly than I ever intended. It wasn't really a bash on solid modules at all. It was more of a personal complaint that the language doesn't have built-in functions for something as simple as deleting/copying/moving a folder recursively. I do thank you for your input. I also updated the dir_del; found a bug in it, lol. I am working on a small package of subs that will allow all file/folder copy/move/delete from Perl built-ins alone. Once again, mostly for fun and to see it done. Nothing against modules as a whole: I think they're great, and I have had the pleasure of using quite a few thus far. I've also had the displeasure of dealing with broken ones that don't build and having to make native workarounds like this... I sort of think it's fun. I never intended to offend anyone, and yes, I am new here and "relatively" new to Perl, I would say. With a strong Bash, BASIC, HTML, and *nix administration/networking background, picking up the basics of Perl was pretty easy for me, but I admit I am still learning some of the do's and don'ts. I appreciate honest criticism, as it helps me learn properly, even if I do throw something out there prematurely or just plain awful when viewed by a veteran :P

Node Type: perlquestion [id://635503]
Approved by ikegami