kingdean has asked for the wisdom of the Perl Monks concerning the following question:
I have a bunch of little Perl scripts running. Each page calls a small Perl script and is generated on the fly, so if 10 people visit that page, it is generated for each of them each time.
How can I have the page generated, say once an hour or so, so that when someone goes to that page it is never generated on the fly?
The result is a regular .html file. The Perl scripts pull in other .html files of about 25k-100k each, and they cut info from other pages on my site so that I can have the same info show up on many pages. I know this probably isn't the best way to do this, but it is working now and I don't want to change the system yet.
Any help on this would be appreciated!
Thanks
Dean
Re: Perl scripts making my site slow
by jeffa (Bishop) on Jun 25, 2004 at 18:49 UTC
You could set up a cron job that runs each script and directs its output to the appropriate file. You would probably want to lock the files first, to prevent someone from requesting a page at the moment it is being updated. I would be very tempted to solve this problem with Template Toolkit's ttree utility. This looks like the kind of job that tool was made for.
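One common way to sidestep the locking problem entirely is to have the cron-run generator write to a temporary file and then rename() it into place; rename() is atomic on the same filesystem, so visitors only ever see either the old page or the complete new one. This is just a sketch: the target path and the generate_page() sub are made up, standing in for whatever the existing scripts produce.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical target -- in practice something like
# /var/www/html/index.html, the file visitors actually request.
my $target = 'index.html';
my $tmp    = "$target.tmp.$$";    # unique temp name per cron run

open my $out, '>', $tmp or die "Cannot open $tmp: $!";
print $out generate_page();
close $out or die "Cannot close $tmp: $!";

# rename() is atomic on the same filesystem: readers see either the
# old page or the new one, never a partially written file.
rename $tmp, $target or die "Cannot rename $tmp to $target: $!";

sub generate_page {
    # Stand-in for whatever the existing on-the-fly script produces.
    return "<html><body>generated content</body></html>\n";
}
```

Because the swap is atomic, no reader ever needs to wait on a lock; the worst case is serving a page that is one update out of date.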
Jeff,
I am a novice, but I am learning. Let me give you an idea what I was thinking about and you tell me if this makes sense.
If I update my index.html file, I upload it and everything is fine. The index.html calls upon several pages and updates automatically. Suppose I put this index.html file in a directory, say www.mysite.com/dontgohere/index.html, and I make a cron job run it and dump the page, as a visitor would see it, into the file www.mysite.com/index.html. Then when someone accesses my main page, it is just an HTML file and shouldn't be slow.
Does this sound like it would work?
Thanks
Dean
That sounds like it should work, but we are leaving out the specifics. Why don't you try this: write a cron job that runs one of these scripts you speak of, and have it write the output to some .html file that a user will view. Set the cron job to run every minute or every 5 minutes until you get it working correctly, then set it to run every hour or two. If you have problems with this, then ask us specific questions about that in a new thread.
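As a concrete illustration, the crontab entries for that testing-then-production schedule might look like the following (the script and output paths here are hypothetical, not from the thread):

```
# While testing: run every 5 minutes
*/5 * * * * /usr/bin/perl /home/dean/scripts/build_index.pl > /home/dean/public_html/index.html
# Once it works: run at the top of every hour
0 * * * * /usr/bin/perl /home/dean/scripts/build_index.pl > /home/dean/public_html/index.html
```

Note that a plain `>` redirect truncates the file for the duration of the script's run, so a visitor hitting the page at exactly that moment could see a partial file; writing to a temp file and renaming it into place avoids that window.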
Then, once you get all of that working ... you can look into Template Toolkit should you want to "upgrade" your system. ;) Cheers and good luck!
Re: Perl scripts making my site slow
by Grygonos (Chaplain) on Jun 25, 2004 at 19:12 UTC
Alternatively, you can check the timestamp of the generated file and compare it against the current time, to see whether the current page request should regenerate the file (while locking the files, as jeffa suggested). This is the better approach if you're on a webserver where you can't use cron jobs.
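A minimal sketch of that timestamp check in a CGI script, assuming a one-hour cache lifetime (the cache filename and generate_page() are made up for illustration). Perl's -M operator returns a file's age in days, so one hour is 1/24:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical cache file and lifetime; adjust to taste.
my $cache   = 'index.html';
my $max_age = 1 / 24;    # -M measures age in days; 1/24 = one hour

# Regenerate if the cached copy is missing or older than an hour.
# Writing to a temp file and renaming keeps the swap atomic, so no
# visitor ever reads a half-written cache.
if ( !-e $cache or -M $cache > $max_age ) {
    my $tmp = "$cache.tmp.$$";
    open my $out, '>', $tmp or die "Cannot write cache: $!";
    print $out generate_page();
    close $out or die "Cannot close cache: $!";
    rename $tmp, $cache or die "Cannot replace cache: $!";
}

# Serve the cached copy.
open my $in, '<', $cache or die "Cannot read cache: $!";
print "Content-type: text/html\n\n";
print while <$in>;

sub generate_page {
    # Stand-in for the existing page-building code.
    return "<html><body>cached content</body></html>\n";
}
```

The first request after the hour pays the generation cost; every other request just streams a static file.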