PerlMonks  

Re^4: What is the fastest way to download a bunch of web pages?

by tphyahoo (Vicar)
on Mar 03, 2005 at 13:15 UTC ( #436194 )


in reply to Re^3: What is the fastest way to download a bunch of web pages?
in thread What is the fastest way to download a bunch of web pages?

I had monkeyed with wget, but wget doesn't handle cookies, POST requests, and redirects nearly as well as LWP, so I'm doing it this way. My real program does more complicated stuff than I presented in this post; I just wanted to keep things simple.
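For what it's worth, the LWP side of this can be sketched roughly as below. The endpoint and form fields are made up for illustration:

```perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

my $ua = LWP::UserAgent->new;

# Persist cookies across runs; autosave writes the jar back to disk.
$ua->cookie_jar(HTTP::Cookies->new(file => 'cookies.txt', autosave => 1));

# LWP only follows redirects for GET/HEAD by default; allow POST too.
push @{ $ua->requests_redirectable }, 'POST';

my $res = $ua->post(
    'http://example.com/search',            # hypothetical form endpoint
    { query => 'perl', submit => 'Go' },    # hypothetical form fields
);
die $res->status_line unless $res->is_success;
print $res->decoded_content;
```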

With regard to the bottleneck, I don't think this will be a problem. I'm not writing a web crawler; this is just something to automate a bunch of form POST requests and massage the data I get back. But that doesn't matter for getting the threading part right.
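The threading part might be sketched as a shared queue of URLs drained by a fixed pool of workers. The URL list and worker count here are placeholders:

```perl
use strict;
use warnings;
use threads;
use Thread::Queue;
use LWP::UserAgent;

my @urls  = map { "http://example.com/page$_" } 1 .. 20;   # placeholder URLs
my $queue = Thread::Queue->new(@urls);                     # pre-filled queue

my @workers = map {
    threads->create(sub {
        my $ua = LWP::UserAgent->new;    # one UserAgent per thread, not shared
        # dequeue_nb returns undef when the queue is empty, ending the loop.
        while (defined(my $url = $queue->dequeue_nb)) {
            my $res = $ua->get($url);
            warn "$url: " . $res->status_line unless $res->is_success;
            # massage $res->content here
        }
    });
} 1 .. 5;                                # pool of 5 worker threads

$_->join for @workers;
```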

I will eventually be storing the results in MySQL, but that is a future PM question....


Replies are listed 'Best First'.
Re^5: What is the fastest way to download a bunch of web pages?
by Anonymous Monk on Mar 03, 2005 at 15:09 UTC
    My real program does more complicated stuff than I had presented in this post, but I just wanted to keep things simple.
    But your question is far, far from simple. You ask an extremely broad question (in the sense that a lot of factors play a role) whose answer isn't going to be purely Perl. Simplification only leads to suggestions (like 'wget') that aren't going to work for you. Note that wget has options to work with cookies, including saving/restoring them to/from a file.
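    wget's cookie options, for reference (the URLs and form data here are placeholders):

```shell
# Log in, saving any cookies the server sets (including session cookies).
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'user=me&pass=secret' http://example.com/login

# Reuse the saved cookies on a later request.
wget --load-cookies cookies.txt http://example.com/members/page.html
```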
    I'm not writing a web crawler, this is just something to automate a bunch of form post requests, and massage the data I get back. But that doesn't matter for getting the threading part right.

    I will eventually be storing stuff in mysql, but this is a future PM question....

    So it seems speed isn't going to be that important. Why not focus first on getting the functionality working, and worry about speed afterwards? Perhaps by the time it's finished, the speed issue will have resolved itself (for instance, because it's already fast enough, or the database turns out to be the bottleneck, or the retrieval is being done in the background).
