http://www.perlmonks.org?node_id=1068377


in reply to Proper way to thread this in PERL.

The idea is to spawn the maximum number of threads once :) and use proper scoping and argument passing.

Re^9: Async DNS with LWP
Re^13: Async DNS with LWP
Re^4: Perl/Tk vs Win32::SerialPort (stash, dispatch, queue)
Re: Challenge: Perl 5: lazy sameFringe()?
Re^5: Consumes memory then crashs
threads::Q

This code is untested, but I've used this pattern before:

#!/usr/bin/perl --
use threads;
use threads::Q;

Main( @ARGV );
exit( 0 );

sub Main {
    ...;    ## GetOpt::Long / GetOpt::Declare ...
    UrlFile_FetchingThreads( '/f/o/o/bar.txt', 4 );
    ## UrlFile_FetchingThreads( $filename, $maxthreads );
}

sub UrlFile_FetchingThreads {
    my( $filename, $maxthreads ) = @_;
    my @urls = GetUrls( $filename );
    my $qin  = threads::Q->new();
    my $qout = threads::Q->new();
    $qin->nq( @urls );
    for ( 1 .. $maxthreads ){
        my $tt = threads->create( \&tryHTTP, $qin, $qout );
        $tt->detach;    ## can't join me :)
    }
    my $quitter = 0;
    while( 1 ){
        if( my $res = $qout->dq_nb ){
            doSomething( $res );
        }
        sleep 1;
        ### this part not used before, completely untested, probably too complex
        if( not $qin->cq ){
            $quitter++;
        } else {
            $quitter = 0;
        }
        if( not $qout->cq ){
            if( $quitter > 5 ){
                die "FINISHED, no more urls to process, no more responses to process\n";
            }
        }
    }
}

sub tryHTTP {
    ## note: don't call threads->detach() here as well,
    ## the main loop already detached us, a second detach dies
    my( $qin, $qout ) = @_;
    while( 1 ){
        my( $url ) = $qin->dq;
        ...;    ## presumably sets up $lwp here
        my $res = $lwp->get( $url );
        $qout->nq( $res );
    }
    return;
}
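For comparison, here is a minimal, self-contained sketch of the same fan-out pattern using the core Thread::Queue module instead of threads::Q, with joined workers rather than detached ones so the main thread knows when all results are in. The doubling of each work item is just a stand-in for the real $lwp->get( $url ) call.

#!/usr/bin/perl
use threads;
use Thread::Queue;

my $qin  = Thread::Queue->new( 1 .. 10 );    # pre-load the work items
my $qout = Thread::Queue->new();             # results come back here
$qin->end;                                   # no more work will be added

# spawn a fixed pool of workers once, passing the queues as arguments
my @workers = map {
    threads->create( sub {
        # dequeue returns undef once the queue is ended and drained
        while( defined( my $item = $qin->dequeue ) ){
            $qout->enqueue( $item * 2 );     # stand-in for the HTTP fetch
        }
    } );
} 1 .. 4;

$_->join for @workers;    # join instead of detach, so all results are queued
$qout->end;

my @results;
while( defined( my $res = $qout->dequeue ) ){
    push @results, $res;
}
print join( ',', sort { $a <=> $b } @results ), "\n";   # 2,4,6,...,20

Joining lets you skip the $quitter heuristic entirely; the queue's end() plus the undef sentinel from dequeue() tells each worker when to stop.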

Re: simple multithreading with curl
Re: Perl crashing with Parallel::ForkManager and WWW::Mechanize
Re^3: Using LWP instead of wget?
Re^3: Fast fetching of HTML response code
Re^2: Need help with Perl multi threading
LWP::Parallel::UserAgent
LWP::Concurrent
Re^10: Consumes memory then crashs
Re^9: Consumes memory then crashs
Re: Are there any memory-efficient web scrapers?
Your main event may be another's side-show.
Re: Perl threads to open 200 http connections
Re^3: trying to get timeout to work (easier with threads)