http://www.perlmonks.org?node_id=1004479

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I have a simple crawler that is based on LWP::UserAgent. It loops through a bunch of database records and fetches and processes web content related to each record. The problem is that the program will unexpectedly terminate at some point with no indication of the reason. This could happen a few minutes or a few hours into the processing, and when it happens the exit code of the program is 0. I am using Carp::Always to enable stack tracing and have added an END block as well as a DESTROY method to my crawler, and neither of those produces any output when it terminates. I have also tried to override CORE::GLOBAL::exit. My code never calls exit, and I'm positive I'm not accidentally short-circuiting my loop, otherwise the die after run would get called. Below is a skeleton of my code. Is there something else I can use to try and narrow this down?
#!/usr/bin/env perl
use 5.016;
use warnings;
use Carp::Always;

BEGIN { *CORE::GLOBAL::exit = sub { die } }
END   { say "END block"; }

my $crawler = Crawler->new;
$crawler->run;
die;

package Crawler {
    use parent 'LWP::UserAgent';
    use List::Util 'shuffle';
    # use a bunch of other modules for the processing.

    sub new {
        my ($class, %args) = @_;
        my $self = $class->SUPER::new(%args);
        $self->{ids} = [ 1 .. 100 ];   # These normally are pulled from a DB.
        return $self;
    }

    sub run {
        my $self = shift;
        my @idx = shuffle 0 .. -1 + @{ $self->{ids} };
        my $cur = 0;
        for my $i (@idx) {
            my $id = $self->{ids}[$i];
            printf "%d/%d: %s (%d)\n", ++$cur, 0+@idx, $id, $i;
            # Fetch and process stuff related to this id.
        }
    }

    sub DESTROY { say "Crawler destroyed" }
}
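
In case it matters, here is a rough sketch of the extra hooks I was planning to bolt on next to catch anything that ends the process quietly. The signal list is just a guess on my part (and SIGKILL can't be trapped at all), so treat this as an idea rather than something I've verified helps:

use Carp ();

# Trace every die, including ones that a surrounding eval would swallow.
$SIG{__DIE__} = sub { Carp::cluck("__DIE__ handler: @_") };

# Trace catchable signals that would otherwise end the process quietly.
for my $sig (qw(INT TERM HUP PIPE)) {
    $SIG{$sig} = sub { Carp::cluck("Caught SIG$_[0]"); die "SIG$_[0]\n" };
}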