PerlMonks |
As far as I can tell from this relatively sparse info, you're both going about this in either a complicated or an inefficient way. How about this:
1. Whenever a page view for a given timespan and event selection is requested, generate that page and write it to disk. If the same info is requested again, serve the static page as cheaply as possible.
2. Whenever an event is added, modified, or deleted, remove the written pages related to that timespan and selection.
3. Send out caching headers of an hour or so for each page.
4. Put a caching proxy in front of your site (definitely worth it if you've got a fairly serious amount of visitors).

This should ensure that you can easily serve quite a lot of views on very cheap hardware and hosting (somewhere between 100 and a couple of hundred euros a month in hosting, and a few thousand euros of hardware), assuming you're only adding/editing/deleting a handful of events per day on average.

edit: steps 3 and 4 alone will give you a significant edge while just generating each page at request time, because only a small fraction of requests will actually end up at your webserver.
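A minimal sketch of steps 1-3 in Python (the cache location, key scheme, and `render` callback are assumptions for illustration, not anything from the original post):

```python
import hashlib
import os

# Hypothetical cache location; any writable directory works.
CACHE_DIR = "/tmp/event_page_cache"


def cache_key(timespan, selection):
    """Map a (timespan, selection) pair to a safe filename."""
    raw = f"{timespan}|{selection}"
    return hashlib.sha256(raw.encode()).hexdigest()


def get_page(timespan, selection, render):
    """Step 1: serve the written page if it exists, else render once
    and write it to disk so repeat requests are cheap."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, cache_key(timespan, selection))
    if os.path.exists(path):
        with open(path) as f:
            return f.read()
    html = render(timespan, selection)
    with open(path, "w") as f:
        f.write(html)
    return html


def invalidate(timespan, selection):
    """Step 2: when an event is added/modified/deleted, remove the
    written page for the affected timespan and selection."""
    path = os.path.join(CACHE_DIR, cache_key(timespan, selection))
    if os.path.exists(path):
        os.remove(path)


def cache_headers(max_age=3600):
    """Step 3: HTTP headers telling clients and proxies to cache the
    page for an hour or so."""
    return {"Cache-Control": f"public, max-age={max_age}"}
```

A caching proxy in front (step 4, e.g. Squid or Varnish) then honours those `Cache-Control` headers, so most hits never reach the webserver at all.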
In reply to Re: precalculating event dates vs. recalculating them.
by Joost