For a project using
WWW::Mechanize for some site-scraping, I ended up needing to wrap most of the calls that do something requiring a network to allow them to be retried. The combination of networks and servers is just a little too flaky to make thousands of HTTP requests without any of them failing, but they usually go through if you retry them a couple of times.
I wanted a succinct way to add this to any action, so I wrote retry() to take a sub ref. I can call it like this:
retry(
    sub {
        $browser->get( $link->url() );
    }
);
And this is the definition of the retry() sub:
sub retry {
    my $sub_ref = shift;

    for ( 1 .. $conf->max_tries() ) {
        eval { $sub_ref->(); };
        last unless $@;
        warn "Failed try $_, retrying. Error: $@\n"
            if $conf->debug();
    }

    if ($@) { die "failed after " . $conf->max_tries() . " tries: $@\n" }
}
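Any of the other calls that hit the network get wrapped the same way, for example (the link text and form field here are just placeholders, not from my real code):

retry( sub { $browser->follow_link( text => 'Next page' ) } );
retry( sub { $browser->submit_form( fields => { query => 'perl' } ) } );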
This assumes that a failed action will throw an exception, which is what happens with the settings I'm using with Mechanize.
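(The behavior I mean is the sort of thing Mechanize's autocheck option gives you; my actual constructor call has a few more settings, but roughly:)

use WWW::Mechanize;

# With autocheck on, get(), follow_link(), submit_form(), etc.
# die when a request fails, so the eval in retry() catches it in $@.
my $browser = WWW::Mechanize->new( autocheck => 1 );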
Now this works great, but I felt like it was sort of a naive approach, and I just wondered if anyone had something clever for this, maybe using some of those less commonly applied loop control constructs that Perl allows. I'm not looking for golf here (although feel free to amuse yourself if you think it sounds fun), but really just wondering if there's a more elegant solution.