http://www.perlmonks.org?node_id=894499

Gangabass has asked for the wisdom of the Perl Monks concerning the following question:

In my CGI script I make a long request (up to 10 seconds) to another server, parse the results, and show the response to my user (via AJAX). But the other server's owner asked me to make no more than one request per 10 seconds, so:

  1. I need to save each of my users' requests;
  2. every ten seconds I can make only one request to the other server.

First I thought about a cron job which would open a simple text file (the queue file), read the first line, and send it as a request to the other server. After that it would save the result in another file (where I'll cache all results). So my CGI would first check the cache file and try to find the result in it, and only if the result is not found would it save the task in the queue file (for the cron job).

But cron runs only once per minute, so my user would have to wait too long...

So how can I do this via CGI?

Maybe:

  1. After checking the cache file, the CGI will estimate the time needed to complete the request (by reading the current queue file) and send this estimate to the HTML page (where I can get this time and make another request after it via AJAX).
  2. After that it will save the request to the queue file and fork. The forked process will wait until its request is at the top of the queue and then make the request to the other server.
  3. Finally it will save the result in the cache file. What do you think?
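The estimate-and-enqueue steps above might be sketched roughly like this (the file name and the 10-second constant are illustrative, not from the post; the forked worker would then poll the queue file until its line reaches the top):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

# Illustrative queue location -- adjust to your setup.
my $queue_file = '/tmp/request_queue.txt';

# Step 1: estimate the wait by counting requests already queued
# (one outgoing request allowed per 10 seconds).
sub estimated_wait {
    open my $fh, '<', $queue_file or return 0;   # no queue yet: no wait
    flock $fh, LOCK_SH or die "flock: $!";
    my $pending = () = <$fh>;                    # count queued lines
    close $fh;
    return 10 * $pending;
}

# Step 2: append this request to the queue under an exclusive lock,
# so two concurrent CGI processes cannot interleave their writes.
sub enqueue {
    my ($request) = @_;
    open my $fh, '>>', $queue_file or die "open $queue_file: $!";
    flock $fh, LOCK_EX or die "flock: $!";
    print {$fh} "$request\n";
    close $fh;
}
```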

Maybe a module has already been written for such tasks?

Replies are listed 'Best First'.
Re: How to emulate queue for CGI script?
by bart (Canon) on Mar 21, 2011 at 13:17 UTC
    I'd put the logic in the browser. You're already using Ajax, so JavaScript is already enabled. It could go something like this.
    • The user composes a request (webform?) and sends it to your server by Ajax.
    • Your CGI script checks how long it has been since a request was sent to a particular server. If it's been less than 10 seconds (or there are other people in the queue), give back a negative response to the browser via the Ajax response, together with an estimate of how much time it should wait before retrying.
    • You might also keep a queue for each server and pass back the queue position to the user, but you'd have to take into account that some users will not wait, so the queue might actually move faster than the first estimate. I'd have the server wait up to 2 seconds for a retry by the user before moving on to the next person in the queue.
    • The browser can then alert the user of the wait time, wait for the indicated time, and try again (with window.setTimeout).
    • If the person is the first in the queue, service the request and return a positive answer.
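    The server-side "how long since the last request" check might look like this in Perl (a sketch; the stamp-file path is an assumption, and flock serializes concurrent CGI processes so the check-and-update is atomic):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $stamp_file = '/tmp/last_request.stamp';   # illustrative path

# Returns 0 if the caller may proceed now (and records the new
# timestamp), otherwise how many seconds the browser should wait
# before retrying via window.setTimeout.
sub seconds_until_allowed {
    open my $fh, '+>>', $stamp_file or die "open $stamp_file: $!";
    flock $fh, LOCK_EX or die "flock: $!";    # one checker at a time
    seek $fh, 0, 0;
    my $last = <$fh> // 0;                    # epoch of last request
    chomp $last;
    my $now = time;
    if ($now - $last >= 10) {
        truncate $fh, 0;                      # claim the slot
        print {$fh} "$now\n";
        close $fh;
        return 0;
    }
    close $fh;
    return 10 - ($now - $last);
}
```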

      There is a problem with trying to do the rate-limiting in the browser: if there is more than one client, you'll blow the limit. I think it has to be done in the back-end.

      You ought to be able to get the effect you want by using CPAN modules. You may be able to simply use HTTP::Cache::Transparent.
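      If caching repeated requests is the main concern, HTTP::Cache::Transparent hooks itself into LWP; a minimal setup (the cache directory and MaxAge value here are illustrative) could look like:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Cache::Transparent;
use LWP::Simple qw(get);

# Cache all LWP responses on disk.
HTTP::Cache::Transparent::init({
    BasePath => '/tmp/http-cache',   # where cached responses are stored
    MaxAge   => 1,                   # hours before a forced refresh
});

# Repeated fetches of the same URL within MaxAge are served from
# disk instead of hitting the remote server again.
my $content = get('http://www.example.com/');
```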

        thargas,
        Perhaps you missed where the CGI checks how long it has been since a request was made? That check covers all clients. The trouble I see with it is a race condition: two clients check at the same time and both think it is OK. One request actually gets delivered and the other gets queued with no notice to the user. I am sure that since we are doing whiz-bang AJAX we could tell the user the approximate wait time even after they are in the queue, though.

        Cheers - L~R

      Thank you for the idea. Here is the code I have now: