PerlMonks |
Re: Duplicate Randoms with SSI
by Rhandom (Curate) on Oct 26, 2007 at 16:24 UTC ( [id://647429] )
Each of those SSI calls will be forking off another process and will have no way to communicate up to the parent process that something has happened.
You do have something else that could potentially work. It has been long enough since I last used an SSI that I don't remember whether REMOTE_ADDR, HTTP_REFERER, or REMOTE_USER is set inside the SSI subprocess, but I think they should be (REMOTE_USER would only be set in an htauthed area). As long as even one of those is set, you can use a solution similar to the following. It isn't 100% accurate, but it should be good enough.

You should note that I have used memcached here. My reasoning is that it gives you a small, localized chunk of memory for this very temporary, very dynamic system: you insert entries into memcache and then forget about them, because old and unused entries are automatically dropped as new entries use up the available memcache space. For the use you have described, a memcache daemon with only 1MB of allocation would be sufficient. Oh, and this solution should add very little overhead to your process.
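The code sample from the original node did not survive extraction. As a stand-in, here is a minimal sketch of the approach described above. It assumes Cache::Memcached from CPAN, a memcached daemon on 127.0.0.1:11211, and hypothetical ad filenames; the key format and expiry time are illustrative choices, not from the original.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Load Cache::Memcached (CPAN) if available; fall back to undef so this
# sketch still runs standalone without the module or a daemon.
my $memd = eval {
    require Cache::Memcached;
    Cache::Memcached->new({ servers => ['127.0.0.1:11211'] });
};

# REMOTE_ADDR (or REMOTE_USER in an htauthed area) identifies the visitor
# across the separate SSI subprocesses, which cannot talk to each other.
my $who = $ENV{REMOTE_USER} || $ENV{REMOTE_ADDR} || 'unknown';

my @ads = qw(ad_one.html ad_two.html ad_three.html);   # hypothetical names

# add() is atomic: it succeeds only when the key is not already stored,
# so two concurrent SSI subprocesses can never claim the same ad.
my $choice;
for (1 .. 20) {
    my $ad = $ads[ rand @ads ];
    if ($memd && $memd->add("seen:$who:$ad", 1, 10)) {   # expires in 10s
        $choice = $ad;
        last;
    }
}
$choice ||= $ads[ rand @ads ];   # every ad recently shown (or no memcached)

print $choice, "\n";
```

This is why the approach isn't 100% accurate: if every ad has been handed out to that visitor within the expiry window (or memcached is unreachable), it falls back to a plain random pick rather than blocking.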
my @a=qw(random brilliant braindead); print $a[rand(@a)];
In Section: Seekers of Perl Wisdom