PerlMonks  

Re^2: Running CGI script within another cgi script

by Anonymous Monk
on Nov 27, 2009 at 18:58 UTC ( [id://809795] )


in reply to Re: Running CGI script within another cgi script
in thread Running CGI script within another cgi script

Hi bellaire,

Thanks for your reply. It works. I asked this question because I am writing a CGI script that calls a function; this function takes 4 minutes to complete, and because of that the page expires. So I tried something like the below:

    use CGI qw(:standard);

    # <meta http-equiv="refresh" content="2;url=some-html-file">
    #   -- tried redirecting to a different page
    function_call();
    # main result: write to a file and then rename it to some-html-file,
    # so that finally I get the result and the page won't expire.
The concept looks neat, but when I run the CGI script it doesn't redirect. Instead it calls the function directly, the page expires, and only after it expires does the redirection happen.

Is there anything I am doing wrong, or is this the expected behavior? Any thoughts on this? Thanks.

Replies are listed 'Best First'.
Re^3: Running CGI script within another cgi script
by almut (Canon) on Nov 27, 2009 at 19:42 UTC

    You need to run the long-running function in the background, so that the CGI script terminates right away. This way, the <meta http-equiv="refresh" ...> will also be sent right away, independently of any buffering that might happen on the way from the CGI script/webserver to the browser... and the browser will then request some-html-file after the specified period (which, btw, should be at least 4 minutes (if the function call takes that long), not 2 secs...).
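    A minimal sketch of emitting such an interim refresh page from a Perl CGI script, using CGI.pm's header/start_html/meta helpers; the target name some-html-file and the 300-second delay are assumptions, not values from the thread:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw(:standard);

# Build the interim page: the meta refresh tells the browser to
# re-request some-html-file after 300 seconds (>= the job's runtime).
my $page = header(-type => 'text/html')
         . start_html(
               -title => 'Working...',
               -head  => meta({ -http_equiv => 'refresh',
                                -content    => '300;url=some-html-file' }),
           )
         . p('Your request is running; results will appear when ready.')
         . end_html();

print $page;
```

    The script prints the page and exits immediately; the long-running work must already have been handed off to a background process, as described above.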

    In case the webserver is Apache running on something unix-ish, you'd fork (and if needed exec) a child process to run something in the background. But don't forget to close (or redirect to a file) STDOUT and STDERR in the child before you start the long-running task. Otherwise Apache will wait on the other end of the pipes set up to the CGI process until both the parent CGI and its child have closed the handles, meaning the initial request would wait/hang for as long as the function call runs (STDIN doesn't need to be closed, because Apache is on the sending side of that pipe and closes it itself).

      Hi almut, is this the concept you are talking about: http://www.stonehenge.com/merlyn/LinuxMag/col39.html ? Thanks
      Thanks, almut, for pointing me the right way. One last question: you said that on *nix you fork/exec the child process, and the article does the same... but I am working with Perl on Windows. Will the same thing work fine on Windows too? I believe it will...

        I, too, believe it will work :)  But I don't have any first hand experience, as I've never (had to) run Apache on Windows...  I suppose Perl's fork emulation on Windows should be okay, but I'm not sure how Apache behaves in the respect in question (this is not to say it wouldn't work — I just can't tell).  In case of doubt, I would just try and see... :)

      Hi almut,

      It is still expiring ... I am doing it like below:

      if (my $pid = fork) {
          # parent does
          # delete_all();   # clear parameters
          print redirect('http://localhost/test.html');
      }
      elsif (defined $pid) {
          close STDERR;
          close STDOUT;
          function_call();
          exit 0;
      }
      else {
          die "message";
      }
      Even now it's not redirecting, and in the end it expires. How should I troubleshoot further? Please help. Thanks

        Generally looks okay to me... except for the print redirect('http://localhost/test.html');, which would redirect without delay, so the browser would request the output page more or less immediately, before it has been written completely (I'm supposing test.html is being produced by function_call(), as you mentioned before).

        This would either lead to partial/premature content being returned, or - more likely on Windows (AFAIK, the file would be locked by the writing process) - the webserver would not be allowed to read it at all, before the long-running process has finished writing the file...   Why not stick with the http-equiv="refresh" (with a delay) that you had tried originally?
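        One way to avoid serving a half-written page is the write-then-rename step the original poster described: write the result under a temporary name and rename() it into place only once it is complete. A hedged sketch (the file names and page content are assumptions):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $final = 'some-html-file';      # the page the refresh points at
my $tmp   = "$final.tmp";          # scratch name while still writing

# Write the full result to the temporary file first.
open my $fh, '>', $tmp or die "cannot write $tmp: $!";
print {$fh} "<html><body>result of the long computation</body></html>\n";
close $fh or die "close of $tmp failed: $!";

# rename() is atomic on the same filesystem: the webserver sees either
# no result file yet or the complete one, never a partial write.
rename $tmp, $final or die "rename to $final failed: $!";
```

        Note that on Windows the final rename can fail if the destination file already exists and is open in another process, so checking its return value matters there.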

        BTW, are you sure it's the original request that hangs, or is it the new redirected one?

Re^3: Running CGI script within another cgi script
by bellaire (Hermit) on Nov 27, 2009 at 19:15 UTC
    It's because the web server is waiting for more output from your script before it honors the redirect. You have to tell the web server you're done with output. One way would be to close STDOUT:
    print CGI->redirect(...);
    close STDOUT;
    Another is to set your buffers to autoflush and send an End-Of-Transmission character:
    $| = 1;
    print CGI->redirect(...);
    print chr(4);   # EOT
    I've never actually had to do something like this, so others might have a more elegant solution.
