PerlMonks
Re^6: Parallel downloading under Win32? by Xenofur (Monk)
on Apr 30, 2009 at 09:30 UTC ( [id://761064] )
I've been running 40 instances of wget at a time, using DU Meter ( http://www.hageltech.com/dumeter/ ) to monitor network activity, as opposed to 20 threads with your solution. If you want to try it out for yourself, I'm loading from this URL: http://api.eve-central.com/api/quicklook?typeid=24312 , with the parameter cycling through a list of indexes (hidden behind a spoiler).
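The 40-at-a-time wget approach amounts to a fixed-size process pool. Here is a minimal, hypothetical sketch of that pattern; the typeid range and file names are placeholders, and the actual download command is left as a comment so the skeleton stays self-contained:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch of a fixed-size process pool (fork is emulated
# with threads on Win32 perl). The typeids below are placeholders.
my $max_jobs = 40;
my @ids      = (24312 .. 24351);
my $running  = 0;
my $reaped   = 0;

for my $id (@ids) {
    if ($running >= $max_jobs) {    # pool full: reap one child first
        wait;
        $running--;
        $reaped++;
    }
    defined(my $pid = fork) or die "fork failed: $!";
    if ($pid == 0) {
        # A real worker would exec the download here, e.g.:
        #   exec 'wget', '-q', '-O', "quicklook_$id.xml",
        #       "http://api.eve-central.com/api/quicklook?typeid=$id";
        exit 0;                     # stand-in for a finished download
    }
    $running++;
}
while ($running > 0) { wait; $running--; $reaped++; }
print "$reaped\n";                  # one child reaped per placeholder id
```

The parent only ever blocks in `wait` when the pool is full, so at most 40 children run concurrently regardless of how long the list of ids is.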
Regarding the preloading of URLs: the maximum number of URLs I'll need to load is ~10000. From what I can tell, the overhead of pre-loading is negligible compared to the actual downloading itself, and as written it makes the code easier for me to read. :) Memory use itself is not THAT much of an issue: I'm fine with taking up half a GB; what I was not fine with were other solutions that would quickly balloon to 1.5 GB.

I know that the best way to handle threads is to create them at the start of the app in a BEGIN block, but that isn't really an option here, as it's a CGI::App web application and there isn't really a way to know whether it will actually do the downloading without loading the CGI::App stuff as well. Thanks for the information and advice in either case; I'll keep them in mind. :)
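The pre-load-then-drain idea can be sketched with core `threads` and `Thread::Queue`: the whole URL list goes into the queue up front, and a fixed set of workers drains it. This is a hypothetical illustration, not the actual application code; the worker count and URL count are placeholders (100 stands in for the ~10000 real URLs), and the download itself is stubbed out:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;

# Hypothetical sketch: pre-load every URL into a shared queue, then
# let a fixed pool of worker threads drain it.
my $n_workers = 20;
my $queue     = Thread::Queue->new;

# Pre-loading ~10000 short URL strings costs on the order of a
# megabyte, negligible next to the downloads themselves.
$queue->enqueue("http://api.eve-central.com/api/quicklook?typeid=$_")
    for 1 .. 100;    # 100 placeholder ids stand in for ~10000

my @workers = map {
    threads->create(sub {
        my $count = 0;
        while (defined(my $url = $queue->dequeue)) {
            # a real worker would fetch $url here (LWP, wget, ...)
            $count++;
        }
        return $count;    # how many URLs this worker handled
    });
} 1 .. $n_workers;

$queue->enqueue(undef) for 1 .. $n_workers;   # one stop signal per worker
my $total = 0;
$total += $_->join for @workers;
print "$total\n";    # every queued URL was handled exactly once
```

Because the queue is populated before the workers start fetching, each worker just blocks in `dequeue` until work (or its `undef` stop signal) arrives; no per-URL thread creation is needed.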
In Section: Seekers of Perl Wisdom