Gangabass has asked for the wisdom of the Perl Monks concerning the following question:
In my CGI script I make a long (up to 10 seconds) request to another server, parse the results, and show the response to my user (via AJAX). But the other server's owner asked me to make no more than one request per 10 seconds, so:
- I need to save each request from my users;
- every ten seconds I can make only one request to the other server.
At first I thought about a cron job which would open a simple text file (the queue file), read the first line, and send it as a request to the other server. After that it would save the result in another file (where I'd cache all results). So my CGI script would first check the cache file and try to find the result there; if the result is not found, it would save the task in the queue file (for the cron job).
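The cache-check step could be sketched like this (a minimal sketch only; the tab-separated "key\tresult" cache format and the `cache_lookup` name are my assumptions, not anything from the post):

```perl
use strict;
use warnings;

# Look up a request key in a flat-file cache.
# Assumed format: one "key<TAB>result" line per cached request.
sub cache_lookup {
    my ($key, $cache_file) = @_;
    open my $fh, '<', $cache_file or return undef;  # no cache yet
    while (my $line = <$fh>) {
        chomp $line;
        my ($k, $result) = split /\t/, $line, 2;
        return $result if defined $k && $k eq $key;
    }
    return undef;   # not cached; caller should enqueue the request
}
```

If `cache_lookup` returns a defined value, the CGI script can answer immediately; otherwise it falls through to the queueing step.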
But cron runs only once per minute, so my user would have to wait far too long...
So how can I do this from CGI?
Maybe:
- After checking the cache file, the CGI script estimates the time needed to complete the request (by reading the current queue file) and sends this estimate to the HTML page (where I can pick it up and fire another AJAX request after that delay).
- It then saves the request to the queue file and forks. The forked process waits until its request is at the top of the queue and then makes the request to the other server.
- Finally, it saves the result in the cache file. What do you think?
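The enqueue-and-estimate part of the plan above could start with something like this (a sketch only; `queue.txt` and the `enqueue` sub are hypothetical, and `flock` is used so concurrent CGI processes don't corrupt the queue):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $queue_file = 'queue.txt';    # hypothetical queue file name

# Append a request line under an exclusive lock and return its
# 0-based queue position, so the caller can estimate the wait.
sub enqueue {
    my ($request) = @_;
    my $fh;
    unless (open $fh, '+<', $queue_file) {
        open $fh, '+>', $queue_file or die "open $queue_file: $!";
    }
    flock $fh, LOCK_EX or die "flock $queue_file: $!";
    my @pending = <$fh>;         # requests already waiting
    seek $fh, 0, 2;              # move to end-of-file before writing
    print {$fh} "$request\n";
    close $fh;                   # closing releases the lock
    return scalar @pending;
}

my $position = enqueue('user-42-request');          # hypothetical key
my $estimate = ($position + 1) * 10;  # one upstream request per 10 s
print "Estimated wait: $estimate seconds\n";
```

The forked child would then repeatedly read the head of the queue (under a shared lock) until its own line is first, sleep out the 10-second slot, perform the remote request, append the result to the cache, and remove its line from the queue, again under `LOCK_EX`.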
Maybe a module has already been written for tasks like this?
Replies are listed 'Best First'.
Re: How to emulate queue for CGI script?
by bart (Canon) on Mar 21, 2011 at 13:17 UTC
by thargas (Deacon) on Mar 21, 2011 at 15:04 UTC
by Limbic~Region (Chancellor) on Mar 21, 2011 at 17:18 UTC
by Gangabass (Vicar) on Mar 21, 2011 at 16:33 UTC