I actually whipped up a script last week, scheduled to run once a day, which fetches the Best Nodes and Worst Nodes nodes in raw mode. I don't do any post-processing on the HTML I get back, but it shouldn't be too hard to build a searchable archive of previous Best/Worst nodes from it.
#!/usr/bin/perl -w
use strict;
use LWP::Simple qw(getstore is_success);

# Build a YYYYMMDD date stamp for today's filenames
my @d = localtime;
my $file = sprintf("%4i%02i%02i", $d[5]+1900, ++$d[4], $d[3]);

for ('best', 'worst'){
    my $url = "http://perlmonks.org/index.pl?node=$_+nodes&displaytype=raw";
    # getstore returns an HTTP status code (even on failure),
    # so check it with is_success rather than "or warn"
    my $status = getstore($url, "bestworst/$file$_.html");
    warn "Cannot get $url: HTTP status $status\n"
        unless is_success($status);
}
exit 0;
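For the once-a-day scheduling, a crontab entry along these lines would do; the paths and the run time are assumptions, not part of the original script:

```
# Run the Best/Worst Nodes fetcher every day at 06:00
# (adjust the script path and interpreter to your setup)
0 6 * * * /usr/bin/perl /home/oeufmayo/bestworst.pl
```

Running it from the directory that contains `bestworst/` matters, since the script stores files under a relative path; either `cd` in the cron command or make the output path absolute.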
<kbd>--
my $OeufMayo = new PerlMonger::Paris({http => 'paris.mongueurs.net'});</kbd>