http://www.perlmonks.org?node_id=985378

NeonFlash has asked for the wisdom of the Perl Monks concerning the following question:

I have written a Perl script using WWW::Mechanize which reads URLs from a text file and connects to them one by one. On each iteration, it parses the content of the webpage looking for some specific keywords and, if they are found, writes them to the output file.
To speed up the process, I used Parallel::ForkManager with MAX_CHILDREN set to 3. Although I have observed an increase in speed, the problem is that after a while the script crashes: the perl.exe process gets killed and it does not display any specific error message.
I have run the script multiple times to see if it always fails at the same point; however, the point of failure seems to be intermittent.
Please note that I have already taken care of any memory leaks in WWW::Mechanize and HTML::TreeBuilder::XPath as follows:
For WWW::Mechanize, I set stack_depth(0) so that it does not cache the history of visited pages.
For HTML::TreeBuilder::XPath, I delete the root node once I am done with it. This approach helped me resolve a memory-leak issue in another, similar script which does not use fork.
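For reference, this is the pattern I am using for both mitigations, reduced to a minimal, self-contained sketch (the URL here is just a placeholder, not a line from my input file):

    use warnings;
    use WWW::Mechanize;
    use HTML::TreeBuilder::XPath;

    my $mech = WWW::Mechanize->new();
    $mech->stack_depth(0);                  # do not keep the history of visited pages

    $mech->get('http://www.example.com/');  # placeholder URL
    if ($mech->success) {
        my $tree = HTML::TreeBuilder::XPath->new();
        $tree->parse($mech->content);
        $tree->eof();                       # signal end of input to the parser
        # ... look for keywords here ...
        $tree->delete();                    # free the tree once done with it
    }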
Here is the structure of the script. I have included only the relevant parts; please let me know if more details are required to troubleshoot:

#!/usr/bin/perl
use HTML::TreeBuilder::XPath;
use WWW::Mechanize;
use Parallel::ForkManager;
use warnings;
use diagnostics;

use constant MAX_CHILDREN => 3;

open(INPUT,  "<",  $input)  || die("Couldn't read from the file, $input with error: $!\n");
open(OUTPUT, ">>", $output) || die("Couldn't open the file, $output with error: $!\n");

$pm = Parallel::ForkManager->new(MAX_CHILDREN);

$mech = WWW::Mechanize->new();
$mech->stack_depth(0);          # do not cache the history of visited pages

while (<INPUT>) {
    chomp $_;
    $url = $_;

    $pm->start() and next;      # fork a child; the parent moves on to the next URL

    $mech->get($url);
    if ($mech->success) {
        $tree = HTML::TreeBuilder::XPath->new();
        $tree->parse($mech->content);
        # do some processing here on the content and print the results to OUTPUT file
        # once done then delete the root node
        $tree->delete();
    }

    $pm->finish();
    print "Child Processing finished\n";    # it never reaches this point!
}

$pm->wait_all_children;

1. I would like to know why this Perl script keeps failing after a while.
2. To aid my understanding, I added a print statement right after the finish method of the fork manager; however, it never prints.
3. I have also used the wait_all_children method, since, as per the module's documentation on CPAN, it waits for all children of the parent process to finish their processing.
4. I do not understand why the wait_all_children method is placed outside the while or for loop (as shown in the documentation as well), since all the processing takes place inside the loop (see the sketch after this list).
5. Memory usage of the perl.exe process keeps growing gradually, even though I have taken care of the memory leaks in WWW::Mechanize and HTML::TreeBuilder.
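For context on questions 2 and 4, this is the loop shape I understand from the Parallel::ForkManager documentation, reduced to a minimal sketch (@urls is just a placeholder list, not my real input file):

    use warnings;
    use Parallel::ForkManager;

    my $pm   = Parallel::ForkManager->new(3);
    my @urls = ('http://www.example.com/a', 'http://www.example.com/b');  # placeholders

    foreach my $url (@urls) {
        $pm->start and next;      # in the parent: fork a child, then continue with the next URL

        # ... the child does its work on $url here ...

        print "Child finished $url\n";   # anything the child should report has to happen
                                         # before finish(), because finish() exits the child
        $pm->finish;                     # the child process ends here
    }

    # back in the parent: wait_all_children sits outside the loop so the parent
    # blocks here until every forked child has exited
    $pm->wait_all_children;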
Thanks.