PerlMonks  

Re^4: Parallel downloading under Win32?

by Xenofur (Monk)
on Apr 29, 2009 at 12:33 UTC [id://760861]


in reply to Re^3: Parallel downloading under Win32?
in thread Parallel downloading under Win32?

I'm amazed that that kind of information isn't in the description of the command itself. Anyhow, since it's a Win32-specific thing, it's useless for me, as I don't want to lose Linux compatibility.

Re^5: Parallel downloading under Win32?
by ikegami (Patriarch) on Apr 29, 2009 at 12:56 UTC

    Officially, it's an internal call you're supposed to access through modules such as IPC::Open2 and IPC::Open3, both of which are documented and portable.

    But yeah, nothing about "1," makes any sense.
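    Not from the thread itself, but as a hedged illustration of the IPC::Open3 route mentioned above: a minimal, portable sketch that spawns a child process (here, perl itself, so the example is self-contained) and captures its STDOUT. The command being run is an arbitrary placeholder.

```perl
use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';

# A separate handle for the child's STDERR; open3 requires a real
# glob here (a plain undef would merge STDERR into STDOUT).
my $err = gensym;

# Run a trivial child command ($^X is the current perl binary).
my $pid = open3( my $in, my $out, $err, $^X, '-e', 'print "hello\n"' );

close $in;                           # nothing to send to the child
my $stdout = do { local $/; <$out> };# slurp the child's STDOUT
waitpid( $pid, 0 );                  # reap the child; status lands in $?

print $stdout;                       # prints "hello"
```

    Unlike a raw pipe open, this works the same way on Linux and Win32, which is the portability point being made here.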

      Thank you!

      I had no idea something like that exists.

      After reading up on IPC::Open2, this works perfectly:
```perl
use IPC::Open2;

for my $id (@ids) {
    $wgets++;
    # Launch wget in the background; open2 returns the child's PID.
    push @pids, open2( undef, undef, 'wget', $url.$id, '-q', '-O', $dir.$id );
    # Cap the pool at 10 concurrent downloads.
    while ( @pids >= 10 ) {
        waitpid( shift @pids, 0 );
    }
}
# Reap the remaining children.
while ( @pids ) {
    waitpid( shift @pids, 0 );
}
```
      And as far as I can tell, it'll be completely cross-platform-compatible, as long as I provide wget.exe. :)
