in reply to Re^3: What is the fastest way to download a bunch of web pages?
in thread What is the fastest way to download a bunch of web pages?
I had monkeyed with wget, but wget doesn't handle cookies, POST requests, and redirects nearly as well as LWP, which is why I'm doing it this way. My real program does more complicated things than what I presented in this post; I just wanted to keep the example simple.
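For the curious, here is a minimal sketch of the kind of LWP setup I mean: a UserAgent with a cookie jar, redirect following, and a form POST. The URL and form fields are placeholders, not from my real program.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

# One UserAgent carries the session: cookies persist across
# requests and redirects are followed automatically.
my $ua = LWP::UserAgent->new(
    cookie_jar   => HTTP::Cookies->new,  # in-memory cookie jar
    max_redirect => 5,                   # follow up to 5 redirects
);

# Hypothetical form POST -- substitute your own URL and fields.
my $res = $ua->post(
    'http://example.com/search',
    { query => 'foo', page => 1 },
);

die "POST failed: ", $res->status_line unless $res->is_success;
print $res->decoded_content;
```

This is the sort of thing wget makes awkward: the cookie set by the server on the first response is sent back automatically on every later request from the same `$ua`.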
As for the bottleneck, I don't think it will be a problem. I'm not writing a web crawler; this is just something to automate a bunch of form POST requests and massage the data I get back. But that doesn't matter for getting the threading part right.
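The threading part I'm trying to get right looks roughly like this: a fixed pool of worker threads pulling URLs from a shared Thread::Queue, each with its own UserAgent (sharing one LWP::UserAgent across threads is not safe). The URLs and pool size here are placeholders.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use LWP::UserAgent;

# Placeholder work list -- my real program builds this dynamically.
my @urls = map { "http://example.com/page$_" } 1 .. 20;

my $queue = Thread::Queue->new(@urls);
$queue->end;    # nothing more will be enqueued; dequeue returns undef when drained

my @workers = map {
    threads->create(sub {
        my $ua = LWP::UserAgent->new;    # one UserAgent per thread
        while (defined(my $url = $queue->dequeue)) {
            my $res = $ua->get($url);
            printf "%s => %s\n", $url, $res->status_line;
        }
    });
} 1 .. 4;    # 4 workers is an arbitrary pool size

$_->join for @workers;
```

The queue does the load balancing for free: fast responses let a worker pull the next URL sooner, so no thread sits idle while another works through a backlog.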
I will eventually be storing the results in MySQL, but that is a future PM question....
Replies are listed 'Best First'.
Re^5: What is the fastest way to download a bunch of web pages?
by Anonymous Monk on Mar 03, 2005 at 15:09 UTC |
In Section: Seekers of Perl Wisdom