note
OeufMayo
<p>I actually whipped up a script last week, scheduled to run once a day, that fetches the [Best Nodes] and [Worst Nodes] nodes in raw mode. I haven't done any post-processing on the HTML I'm getting back, but it shouldn't be too hard to make a searchable archive of previous Best/Worst nodes.</p>
<code>
#!/usr/bin/perl -w
use strict;
use LWP::Simple qw(getstore is_success);

my @d = localtime;
my $file = sprintf("%04d%02d%02d", $d[5] + 1900, $d[4] + 1, $d[3]);

for ('best', 'worst') {
    my $url = "http://perlmonks.org/index.pl?node=$_+nodes&displaytype=raw";
    # getstore returns an HTTP status code, not a boolean,
    # so test it with is_success() rather than "or warn".
    my $status = getstore($url, "bestworst/$file$_.html");
    warn "Cannot get $url: HTTP status $status\n" unless is_success($status);
}
exit 0;
</code>
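<p>As a rough sketch of what that post-processing could look like: the routine below (hypothetical, assuming the <tt>bestworst/</tt> directory layout the script above writes to) strips the tags from each archived snapshot and reports which days mention a given string. The function name and the crude tag-stripping regex are my own invention, not part of the script above.</p>

```perl
#!/usr/bin/perl -w
use strict;

# Search the archived raw-HTML snapshots for a string and
# return the filenames (one per day/list) that contain it.
sub search_archive {
    my ($dir, $term) = @_;
    my @hits;
    opendir(my $dh, $dir) or die "Cannot open $dir: $!\n";
    for my $file (sort grep { /\.html$/ } readdir $dh) {
        open(my $fh, '<', "$dir/$file") or next;
        local $/;                 # slurp the whole file
        my $html = <$fh>;
        close $fh;
        $html =~ s/<[^>]*>//g;    # crude tag stripping; good enough for raw node dumps
        push @hits, $file if index($html, $term) >= 0;
    }
    closedir $dh;
    return @hits;
}
```

<p>Calling <tt>search_archive('bestworst', 'some monk')</tt> would then list every snapshot where that monk made the Best or Worst list.</p>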
<kbd>-- <br>
my $<a href="/index.pl?node=OeufMayo&lastnode_id=1072">OeufMayo</a> = new PerlMonger::Paris({http => '<a href="http://paris.mongueurs.net">paris.mongueurs.net</a>'});</kbd>
92597