PerlMonks |
Re^3: Copy Files from network
by periapt (Hermit) on Nov 18, 2004 at 21:34 UTC ( [id://408898] )
OK, Luca Benini has a pretty good idea. However, I'm assuming that you can't run separate processes on the downstream machines. You might still be able to use several processes. Based on your comment above, the bottleneck is the copy from server S (for whatever reason). Maybe some asynchronous transfer scheme would help out.

If the big bottleneck is bandwidth from S, start several transfers (each in its own process using "start"), copying one or more files from S to local until you are using all your bandwidth. After this first transfer is over, initiate a second series of transfers, fewer than the first run, so that you use up only about half your local bandwidth. Take the remaining half of your bandwidth and initiate transfers of the data you have already downloaded to the downstream systems.

Note that you can start multiple copy processes in Win2K by chaining your start calls with &, like "start copy a: b: & start copy a: c: & start copy a: d:" and so on. Windows should start three separate copy actions without waiting for the first one to finish.

Come to think of it, my previous post should have been something like this.

These ideas are untested for the most part. I'm not even sure how you might figure out your bandwidth limits, but maybe it is a place to start?

PJ
use strict; use warnings; use diagnostics;
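To make the chained-start idea concrete, here is an untested sketch in Perl that just builds the "start copy ... & start copy ..." command line described above. The server path, local directory, and downstream share names are made-up placeholders for illustration; adjust them for your own setup before running anything.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build one command line that chains several "start copy" calls with &,
# so Win2K launches every copy without waiting for the previous one.
sub parallel_copy_cmd {
    my ($src, @dests) = @_;
    return join ' & ', map { qq{start copy "$src" "$_"} } @dests;
}

# First wave (hypothetical paths): pull from server S at full bandwidth.
my $pull = parallel_copy_cmd('\\\\S\\share\\data', 'C:\\incoming');
# system($pull);    # uncomment on the Win2K box itself

# Second wave: fan the already-downloaded data out to downstream machines
# while a smaller pull from S continues in parallel.
my $fanout = parallel_copy_cmd('C:\\incoming',
                               '\\\\down1\\drop', '\\\\down2\\drop');
print "$fanout\n";
```

The subroutine only assembles the string; whether three simultaneous copies actually saturate (or exceed) your bandwidth is something you would have to measure on your own network.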
In Section: Seekers of Perl Wisdom