
Forking in CGI scripts

by nextGenKid (Initiate)
on Nov 06, 2012 at 08:33 UTC (#1002431=perlquestion)
nextGenKid has asked for the wisdom of the Perl Monks concerning the following question:

Dear All,

I have a situation where I have to fork processes in a Perl CGI script, and I'm not sure how to implement this so as to reduce the response time, since this is a web script. My background is in bioinformatics. My situation is as follows:

1) I will get multiple inputs to the main CGI script, and the data is stored in an array

2) For each element in the array:

2.1 parse the input

2.2 use a system command to fetch the data

2.2.1 the data is written to multiple files

2.3 use a system command to run a third-party tool on each file from the above step

2.4 parse each result

2.5 run a third-party tool on the above results

2.6 parse the result

3) Show the consolidated results

What I need is for each element in the array (from step 1) to be processed in parallel, and the system command in step 2.3 should also run in parallel.

How could I implement this?
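A minimal sketch of the fork-per-element approach described above (not the poster's actual code: the inputs are stand-ins for the real CGI parameters, and a placeholder file write stands in for the parsing and `system()` calls in steps 2.1-2.6):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

my $dir    = tempdir();                    # scratch area for per-child results
my @inputs = ('seqA', 'seqB', 'seqC');     # stand-ins for the CGI inputs

my @pids;
for my $input (@inputs) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: steps 2.1-2.6 would go here; this placeholder
        # write stands in for the system() / third-party tool calls.
        open my $fh, '>', "$dir/$input.out" or die "open: $!";
        print $fh "processed $input\n";
        close $fh;
        exit 0;    # child must exit, or it would re-enter the loop
    }
    push @pids, $pid;    # parent: remember the child and keep forking
}

# Step 3: wait for every child, then consolidate the results.
waitpid($_, 0) for @pids;

my @results;
for my $input (@inputs) {
    open my $fh, '<', "$dir/$input.out" or die "open: $!";
    chomp(my $line = <$fh>);
    push @results, $line;
}
print join(", ", @results), "\n";
```

In a real CGI script you would also want to cap the number of simultaneous children (e.g. with Parallel::ForkManager) so a burst of requests cannot fork-bomb the server.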

Re: Forking in CGI scripts
by Anonymous Monk on Nov 06, 2012 at 09:33 UTC
Re: Forking in CGI scripts
by sundialsvc4 (Monsignor) on Nov 06, 2012 at 14:42 UTC

    Also, if the work to be done is "substantial," you might not want to implement it in the CGI script at all, for various reasons. Instead, you might conceive of the CGI script strictly as a user interface, in which you can queue work requests (say, they're written to a database table), observe the completion status of requests you've queued, and retrieve the results selectively. Meanwhile, a back-end processor or processors, say written in Perl, poll this queue and carry out the work. (They put the results in a directory that's mutually accessible and note in the database where they put it.) Now, the CGI system is back to doing what it customarily does: handling very brief, "get in then get out in a few milliseconds" requests. The worker-bee daemons can be anywhere. And results, once generated, can be viewed more than one time at will. How you devise the daemons to work (i.e., forking or not) is entirely up to you and no longer of any concern to CGI.
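One way to sketch that queue idea is a file-based spool directory shared by the CGI script and the worker (the `enqueue`/`poll_once` names and the spool layout are hypothetical; a database table, as suggested above, would work the same way):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

my $spool = tempdir();    # stands in for a shared queue directory

# --- CGI side: enqueue a request and return to the browser at once ---
sub enqueue {
    my ($spool, $id, $payload) = @_;
    open my $fh, '>', "$spool/$id.req" or die "open: $!";
    print $fh $payload;
    close $fh;
}

# --- worker side: poll for requests, do the work, leave results behind ---
sub poll_once {
    my ($spool) = @_;
    for my $req (glob "$spool/*.req") {
        (my $res = $req) =~ s/\.req$/.res/;
        open my $in, '<', $req or die "open: $!";
        my $payload = do { local $/; <$in> };
        close $in;
        open my $out, '>', $res or die "open: $!";
        print $out "done: $payload";    # the real tool runs would go here
        close $out;
        unlink $req;                    # mark the request as consumed
    }
}

enqueue($spool, 'job1', 'blast query');
poll_once($spool);

open my $fh, '<', "$spool/job1.res" or die "open: $!";
my $result = do { local $/; <$fh> };
print "$result\n";    # done: blast query
```

A later CGI request can then check whether `job1.res` exists to report status, exactly the "observe the completion status ... retrieve the results selectively" flow described above.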

Node Type: perlquestion [id://1002431]
Approved by Corion