PerlMonks |
Load and the complexity of the situation are big factors in deciding what to do. As sundialsvc4 points out, if you have a complex situation, go find a product to take care of this for you.
We had a need for something like this, and it was (and will remain) quite simple: a web front-end with possibly many inputs that must be worked on sequentially, or else in very controlled batches. Parallel processing caused big problems for us. Interestingly, we could process 1 request in time T, but 30 requests in time 2T, so it was pretty easy to stay caught up.

The web pages insert the data into a DB table that acts as a queue; that gives us persistence. The CGI then kicks a worker process with a signal to wake it up and moves on. The CGI doesn't have to report a status, just queue the work.

The worker sits there, and when woken it starts by reading up to X items from the queue (X differs by time of day). It works on them for about 10 minutes before saving the results, then removes those X items from the queue. Note that for us, if the work is done twice because something bad happened (which is extremely rare), it's OK, because the earlier results are merely replaced with the same data. The program loops, doing X items at a time, until the queue is empty, and then goes back to sleep.

The program is 105 lines of Perl code, including a number of comments and blank lines; it's not a hard program to write if you take care. *BUT* we have very simple needs, and those needs haven't changed since some tweaking at the very beginning (basically trial and error to find the correct size of X).

If your needs are simple, rolling your own can work fine. If I needed the complexity of cron+make+otherstuff, I'd seriously look to the commercial world for a product.

HTH,
Kevin

In reply to Re: Queuing system for running a perl program
by kbrannen
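[Editor's sketch] The poster's actual program is 105 lines of Perl and isn't shown here; the following is only a rough illustration of the design described above (DB table as persistent queue, batch of up to X items, results saved before the queue rows are deleted, loop until empty). It is written in Python with SQLite standing in for their database, and every name in it (`queue`, `results`, `BATCH_SIZE`, `enqueue`, `drain`) is invented for the example, not taken from the post.

```python
import sqlite3

BATCH_SIZE = 5  # the post's "X"; the author tuned this by trial and error


def make_queue(conn):
    # Hypothetical schema: one table holding pending work, one for results.
    conn.execute("CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.execute("CREATE TABLE IF NOT EXISTS results (id INTEGER PRIMARY KEY, output TEXT)")
    conn.commit()


def enqueue(conn, payload):
    # The CGI's job: insert the work item and move on.  (In the post it
    # also sends a signal to wake the worker; that part is omitted here.)
    conn.execute("INSERT INTO queue (payload) VALUES (?)", (payload,))
    conn.commit()


def drain(conn, work):
    # The worker's loop once woken: read up to BATCH_SIZE items, process
    # them, save the results, and only THEN delete the queue rows.  A
    # crash mid-batch just means the same work is redone and the earlier
    # results are replaced with the same data, as the post describes.
    while True:
        rows = conn.execute(
            "SELECT id, payload FROM queue ORDER BY id LIMIT ?", (BATCH_SIZE,)
        ).fetchall()
        if not rows:
            break  # queue empty: go back to sleep
        for item_id, payload in rows:
            conn.execute(
                "INSERT OR REPLACE INTO results (id, output) VALUES (?, ?)",
                (item_id, work(payload)),
            )
        conn.execute(
            "DELETE FROM queue WHERE id IN (%s)" % ",".join("?" * len(rows)),
            [r[0] for r in rows],
        )
        conn.commit()
```

In a real worker the process would block (e.g. in `signal.pause()` on POSIX) with a `SIGUSR1` handler, and the CGI would `kill -USR1` the worker's PID after enqueueing, which is one way to get the "kick it with a signal and move on" behavior the post describes.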