note
bitingduck
<p>Last time I did something like that, I just used a shell script wrapped around my perl scraper, and did the upload with scp:</p>
<code>
#!/bin/bash
# scrape targets and upload the results to the server
. "$HOME/.bash_profile"
cd "$HOME/MyBot" || exit 1
FILENAME=$(date "+scrapefiles/scrapetarget%Y%m%d.txt")
perl MyRobot.pl > "$FILENAME"
scp "$FILENAME" myusername@mydomain.com:templistings
</code>
<p>"templistings" is the remote directory I load my scrapings into, where a cron job looks for them periodically and loads them into a MySQL database. In your case, you would specify whichever directory in your web tree you want the file to land in and write it out as a .html file. You could delete the local copy afterward, but I keep mine as an archive.</p>
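<p>The server side of that is nothing more than a crontab entry that runs the loader periodically. A minimal sketch (the schedule and the loader script name here are made up; substitute your own):</p>
<code>
# run the loader every 15 minutes to pick up new scrape files
*/15 * * * * $HOME/bin/load_listings.pl
</code>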
<p>An alternative would be to use something like <a href="http://search.cpan.org/~ivan/Net-SCP-0.08/SCP.pm">Net::SCP</a> to do the whole thing from within your perl program. You could probably also have your program ssh in to server2 (with <a href="http://search.cpan.org/~turnstep/Net-SSH-Perl-1.34/lib/Net/SSH/Perl.pm">Net::SSH::Perl</a>) and write the file directly without making a local copy, but that seems a little more fragile to me.</p>
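<p>As a rough sketch of the Net::SCP route, the upload step above could look something like this in Perl (the hostname, username, and directory names are the same placeholders as in the shell script, not real values):</p>
<code>
#!/usr/bin/perl
use strict;
use warnings;
use Net::SCP;

# build the dated filename, as in the shell version
my @t = localtime;
my $filename = sprintf "scrapefiles/scrapetarget%04d%02d%02d.txt",
    $t[5] + 1900, $t[4] + 1, $t[3];

# copy the local file to the remote staging directory
my $scp = Net::SCP->new("mydomain.com", "myusername");
$scp->put($filename, "templistings/")
    or die "upload failed: $scp->{errstr}\n";
</code>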
<p>If the problem is that you need it to run from inside a PHP page, that's a PHP problem, and there are probably better places to ask (though some here might be able to help).</p>
991356