Re^3: Parallel downloading under Win32?

by ikegami (Patriarch)
on Apr 29, 2009 at 12:16 UTC


in reply to Re^2: Parallel downloading under Win32?
in thread Parallel downloading under Win32?

The "1," is special. See perlport

Re^4: Parallel downloading under Win32?
by Xenofur (Monk) on Apr 29, 2009 at 12:33 UTC
    I'm amazed that kind of information isn't in the documentation of the command itself. Anyhow, since it's a Win32-specific thing, it's useless for me; I don't want to lose Linux compatibility.

      Officially, it's an internal call you're supposed to access through modules such as IPC::Open2 and IPC::Open3, both of which are documented and portable.

      But yeah, nothing about "1," makes any sense.
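
      For reference, a minimal sketch of the interface these modules share, matching the IPC::Open3 synopsis (the wget command here is only a placeholder):

          use IPC::Open3;
          use Symbol 'gensym';

          # open3 returns the child's PID and wires up all three streams.
          my $err = gensym;   # stderr needs its own handle, or it merges into stdout
          my $pid = open3(my $chld_in, my $chld_out, $err, 'wget', '--version');

          waitpid($pid, 0);   # reap the child when it exits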

        Thank you!

        I had no idea something like that existed.

        After reading up on IPC::Open2, this works perfectly:
        use IPC::Open2;

        for my $id (@ids) {
            $wgets++;

            # open2 returns the child's PID; undef for the handles, since
            # wget's stdin/stdout aren't used here (-q, output goes to a file).
            push @pids, open2( undef, undef,
                               'wget', $url.$id, '-q', '-O', $dir.$id );

            # Throttle: keep at most 10 downloads in flight.
            while ( @pids >= 10 ) {
                waitpid( shift @pids, 0 );
            }
        }

        # Reap whatever is still running.
        while ( @pids ) {
            waitpid( shift @pids, 0 );
        }
        And as far as I can tell, it'll be completely cross-platform compatible, as long as I provide wget.exe. :)
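
        One caveat worth noting: per its documentation, open2 does not return on failure; it raises an exception matching /^open2:/. A more defensive version of the loop body might therefore trap each spawn (a sketch only, reusing the variables from the snippet above):

            # Trap a failed spawn so one bad fork doesn't abort the whole batch.
            my $pid = eval {
                open2( undef, undef, 'wget', $url.$id, '-q', '-O', $dir.$id );
            };
            if ( !defined $pid ) {
                warn "could not start wget for $id: $@";
                next;
            }
            push @pids, $pid;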
