PerlMonks  

Re: Copy Files from network

by periapt (Hermit)
on Nov 17, 2004 at 13:45 UTC (#408403)


in reply to Copy Files from network

Since you are already using the system copy command, you could try invoking a separate process for each of the five computers. Something like:

    # assuming that you copy all the files in the local directory to all
    # computers; if not, you can change the src and computer variables to
    # reference only the files you need -- the "start copy" portion
    # remains the same
    my @computer = qw(
        comp1\dir\*.*
        comp2\dir\*.*
        comp3\dir\*.*
        comp4\dir\*.*
        comp5\dir\*.*
    );
    my @rtncd = ();
    my $src = 'local\directory\*.*';
    $rtncd[$_] = ( system("start copy /Y /Z $src $computer[$_]") >> 8 )
        for (0 .. $#computer);
    ...
    # handle return codes here
This should start up five separate processes in Win2K.
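For what it's worth, if fork is available where you run this (or you just want to stay in pure Perl rather than shelling out to start), the same fan-out can be sketched with one forked child per destination and File::Copy. This is only a sketch under that assumption; parallel_copy and the directory names are made up for illustration:

```perl
use strict;
use warnings;
use File::Copy     qw(copy);
use File::Basename qw(basename);

# Fork one child per destination directory; each child performs a single
# copy and reports success through its exit status.  Returns a hash of
# destination directory => exit code (0 = ok, 1 = copy failed).
sub parallel_copy {
    my ($src, @dest_dirs) = @_;

    my %kids;    # pid => destination directory
    for my $dir (@dest_dirs) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            # Child: do one copy, then exit with a status the parent can read.
            exit(copy($src, $dir . '/' . basename($src)) ? 0 : 1);
        }
        $kids{$pid} = $dir;
    }

    # Parent: reap every child and collect its exit status from $?.
    my %rtncd;
    while (%kids) {
        my $pid = wait();
        last if $pid == -1;
        $rtncd{ delete $kids{$pid} } = $? >> 8;
    }
    return %rtncd;
}
```

Each child's exit status comes back through $?, so the parent ends up with a per-destination return code much like the @rtncd array above.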

PJ
use strict; use warnings; use diagnostics;


Re^2: Copy Files from network
by gpurusho (Acolyte) on Nov 18, 2004 at 06:25 UTC
    Further details:
    Assume the server is "S". I have 5 destinations: A, B, C, D, E.

    The process I follow is: copy from "S" to my local drive, then copy to the five other machines. I find that the initial copy, from "S", takes a long time. This is where I am reading through the parameter file and copying the files one by one. How can I read two or more files at a time, and copy two or more files at a time? I haven't done threads or forks. Can someone help me in this regard?

    Thanks.

      OK, Luca Benini has a pretty good idea. However, I'm assuming that you can't run separate processes on the downstream machines. You might still be able to use several processes. Based on your comment above, the bottleneck is the copy from server S (for whatever reason).

      Maybe some asynchronous transfer scheme would help out. If the big bottleneck is bandwidth from S, start several transfers (each in its own process using "start"), copying one or more files from S to local, until you are using all your bandwidth. After this first transfer is over, initiate a second series of transfers, fewer than the first run, so that you use up only about half your local bandwidth. Take the remaining half of your bandwidth and initiate transfers of the already-downloaded data to the downstream systems.
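The "several transfers at once, then a smaller second wave" idea can be sketched generically as a throttle that runs a list of commands at most $max at a time. Untested against the poster's setup; the commands are whatever copy invocations you would otherwise start by hand:

```perl
use strict;
use warnings;

# Run @cmds via the shell, at most $max concurrently, starting a new one
# whenever a slot frees up.  Returns the exit statuses in the same order
# as @cmds.  A crude stand-in for the staged-transfer idea above.
sub run_throttled {
    my ($max, @cmds) = @_;

    my (%slot, @status);    # %slot: pid => index into @cmds
    for my $i (0 .. $#cmds) {
        if (keys(%slot) >= $max) {          # all slots busy: wait for one
            my $pid = wait();
            $status[ delete $slot{$pid} ] = $? >> 8;
        }
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {
            exec $cmds[$i] or exit 127;     # child runs one command
        }
        $slot{$pid} = $i;
    }
    while (keys %slot) {                    # drain the remaining children
        my $pid = wait();
        $status[ delete $slot{$pid} ] = $? >> 8;
    }
    return @status;
}
```

For the two-stage plan above, you would call run_throttled once with the pulls from S, then again with a mixed list of remaining pulls and downstream pushes.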

      Note that you can start multiple process transfers in Win2K by chaining your start calls with &, like "start copy a: b: & start copy a: c: & start copy a: d:" and so on. Windows should start three separate copy actions without waiting for the first one to finish.

      Come to think of it, my previous post should have been something like:
      # instead of
      # $rtncd[$_] = (system("start copy /Y /Z $src $computer[$_]")>>8) for (0..$#computer);
      # use something like ...
      my @cmdstr = ();
      push @cmdstr, "start copy /Y /Z $src $computer[$_]" for (0 .. $#computer);
      my $cmdstr = join(' & ', @cmdstr);
      my $rtncd = system($cmdstr);
      These ideas are untested for the most part. I'm not even sure how you might figure out your bandwidth limits, but maybe it is a place to start?

      PJ
      use strict; use warnings; use diagnostics;
