http://www.perlmonks.org?node_id=584740

perlmonkey2 has asked for the wisdom of the Perl Monks concerning the following question:

Hello monks, I've been using LWP::Parallel::UserAgent for quite some time now for an academic web miner, but I'm having a problem I just cannot get around. After the application has been running for a while, all of the open sockets fail with a timeout, and I'm not sure what is going on or how to correct it. The application works by implementing an on_return callback that parses the returned HTML and registers any new URLs that might be of interest. I run my own caching DNS server, and I pull at most one file per domain every 5 minutes. Here is my constructor:
my $ua = Spider::LWP->new($depth, $path, $max_sockets, $ignore, $exclude);
$ua->duplicates(0);           # don't ignore duplicates here, as this is done in the subclass a gazillion times more efficiently
$ua->cookie_jar({});          # where else would you store cookies?
$ua->redirect(1);             # follow redirects. HACKED BASE LIBRARY to make this work with the subclass.
$ua->in_order(1);             # do the URLs in order, as we randomize their entry into the queue.
$ua->remember_failures(0);    # don't remember failures here, as the lib stores the entire object. This is done in the subclass.
$ua->max_hosts($max_sockets); # max open requests at any given moment
$ua->max_req(1);              # max requests per host
$ua->nonblock(1);             # don't block on LWP::UserAgent socket reads
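For context, the subclass's on_return callback does roughly the following (heavily simplified sketch; the real parser, duplicate handling, and URL filtering live in Spider::LWP and are more involved, and HTML::LinkExtor here just stands in for them):

package Spider::LWP;
use strict;
use warnings;
use base 'LWP::Parallel::UserAgent';
use HTML::LinkExtor;
use HTTP::Request;

# Called by LWP::Parallel::UserAgent whenever a request completes.
sub on_return {
    my ($self, $request, $response, $entry) = @_;
    return unless $response->is_success;

    # Pull href/src links out of the returned HTML, resolved against the response base.
    my $extor = HTML::LinkExtor->new(undef, $response->base);
    $extor->parse($response->content);

    for my $link ($extor->links) {
        my (undef, %attrs) = @$link;
        for my $url (values %attrs) {
            # Queue any new URL of interest for a later fetch.
            $self->register(HTTP::Request->new(GET => $url));
        }
    }
    return;
}

1;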
Then register the beginning URLs:
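The registration step is just a loop over the seed URLs along these lines (sketch; @start_urls is a placeholder for however the seed list is actually built):

use HTTP::Request;

for my $url (@start_urls) {
    # Queue the request; nothing is fetched until wait() runs the event loop.
    if (my $res = $ua->register(HTTP::Request->new(GET => $url))) {
        # register() returns an error HTTP::Response if the request could not be queued.
        print STDERR $res->error_as_HTML;
    }
}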
$ua->wait(300); # block until we are all finished or until everything has stopped for 5 minutes
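wait() hands back the finished entries, which can be walked like this (sketch; same call as above, but capturing the return value):

my $entries = $ua->wait(300);
for my $key (keys %$entries) {
    my $response = $entries->{$key}->response;
    # Failed requests show up here as error responses (e.g. the timeouts).
    printf "%s: %s %s\n", $response->request->uri, $response->code, $response->message;
}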
Can anyone see anything wrong with this? What I think is happening, and maybe I'm way off course here, is that for some reason one socket gets BLOCKED and times out, and that timeout then causes all the other sockets to time out as well.