Re: Critical sections; In the Perl interpreter

by davido (Cardinal)
on Feb 16, 2017 at 20:25 UTC ( [id://1182166] )


in reply to Critical sections; In the Perl interpreter

I was asked by a friend why, after starting 150 web spider threads, his system would bog down until it seemed to stop, yet was still running.

He may be asking the wrong question. The real question might be something more like, "Is it possible to do 150 simultaneous requests without dragging the system to its knees?" Look at the problem that needs to be solved and remain open to lighter-weight solutions that might not have previously been considered.

# Concurrent non-blocking requests (synchronized with a delay)
Mojo::IOLoop->delay(
    sub {
        my $delay = shift;
        $ua->get('mojolicious.org' => $delay->begin);
        $ua->get('cpan.org'        => $delay->begin);
    },
    sub {
        my ($delay, $mojo, $cpan) = @_;
        say $mojo->result->dom->at('title')->text;
        say $cpan->result->dom->at('title')->text;
    }
)->wait;

...from the Mojo::UserAgent docs.

So the sub containing the gets could look like this:

sub {
    my $delay = shift;
    $ua->get($_ => $delay->begin) for @list_of_urls;
},
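Firing all 150 gets at once is still harder on the system than it needs to be; with the same non-blocking user agent it is easy to cap how many requests are in flight at a time and start a new one as each finishes. A minimal sketch, assuming Mojolicious is installed; the URL list, the `$max` cap, and the callback body are illustrative, not from the original post:

```perl
use Mojo::Base -strict;    # strict, warnings, and say()
use Mojo::UserAgent;
use Mojo::IOLoop;

my $ua   = Mojo::UserAgent->new(max_redirects => 3);
my @urls = qw(mojolicious.org cpan.org perlmonks.org);  # hypothetical queue
my $max    = 2;    # cap on simultaneous connections (illustrative)
my $active = 0;

my $fetch;
$fetch = sub {
    # Start new requests until the cap is reached or the queue is drained.
    while ($active < $max and my $url = shift @urls) {
        ++$active;
        $ua->get($url => sub {
            my (undef, $tx) = @_;
            --$active;
            say "$url: ", $tx->result->code;
            $fetch->();    # refill the slot we just freed
            Mojo::IOLoop->stop unless $active or @urls;
        });
    }
};
$fetch->();
Mojo::IOLoop->start unless Mojo::IOLoop->is_running;
```

With a cap like this, 150 URLs become a queue serviced by a handful of concurrent connections on a single thread, rather than 150 threads all contending for memory and sockets at once.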

Dave
