Last time I did something like that, I just used a shell script wrapped around my Perl scraper and did the upload with scp:
#!/bin/sh
# scrape targets and upload the results to the server
FILENAME="listings-$(date +%F).txt"   # name the output however you like
perl MyRobot.pl > "$FILENAME"
scp "$FILENAME" firstname.lastname@example.org:templistings
"templistings" is the remote directory I drop my scrapings into; a cron job on the server checks it periodically and loads whatever it finds into a MySQL database. In your case you would just point scp at the directory in your web tree where you want the page to live and write it out as a .html file. You could delete the local copy afterward, but I keep mine in an archive.
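For the periodic pickup, a single crontab entry on the server is enough. The script name and paths below are made up for illustration; the load script itself would just loop over the files in templistings and feed them to mysql:

```shell
# min hour dom mon dow  command
# every 15 minutes, load any new scrapings and log the result
*/15 * * * *  $HOME/bin/load_listings.sh >> $HOME/load_listings.log 2>&1
```

Edit it in with `crontab -e` on the server account that owns templistings.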
An alternative would be to use something like Net::SCP to do the whole thing from within your perl program. You could probably also have your program ssh in to server2 (with Net::SSH::Perl) and write the file directly without making a local copy, but that seems a little more fragile to me.
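The Net::SCP version might look something like this. The filename, hostname, username, and remote path are all placeholders; substitute your own (this also assumes key-based ssh auth is already set up, since Net::SCP shells out to scp):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::SCP;

# Hypothetical names -- use your own scraper output and server details.
my $file = 'listings.html';

my $scp = Net::SCP->new('example.org', 'username');
$scp->put($file, "templistings/$file")
    or die "upload failed: $scp->{errstr}\n";
```

That keeps everything in one Perl program instead of wrapping it in a shell script.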
If the problem is that you need it to run from inside a PHP page, that's a PHP question, and there are probably better places to ask (though someone here might be able to help).