
Re: Running CGI script within another cgi script

by bellaire (Hermit)
on Nov 27, 2009 at 18:27 UTC ( #809792=note )

in reply to Running CGI script within another cgi script

It'd probably be best to redirect the user's browser to the target CGI's URL, so that you don't have to worry about nonsense like suppressing its HTTP headers.
use CGI qw/:standard/;
print redirect('');

Re^2: Running CGI script within another cgi script
by Anonymous Monk on Nov 27, 2009 at 18:58 UTC
    Hi bellaire,

    Thanks for your reply. It works. I asked this question because I am writing a CGI script which calls a function; this function takes 4 minutes to complete, and the page expires before it finishes. So I tried something like below:

    use CGI qw/:standard/;
    # tried redirecting to a different page:
    print '<meta http-equiv="refresh" content="2;url=some-html-file">';
    function_call();
    # main result is written to a file and then renamed to some-html-file,
    # so that I finally get the result and the page won't expire
    The concept looks neat, but when I run the CGI script it doesn't redirect: it calls the function directly, the page expires, and only after that does the redirection happen.

    Is there anything I am doing wrong? Or is this the expected behavior? Any thoughts on this? Thanks.

      You need to run the long-running function in the background, so the CGI script terminates right away. That way, the <meta http-equiv="refresh" ... will also be sent right away, independently of any buffering that might happen on the way from the CGI script/webserver to the browser... and the browser will then request some-html-file after the specified period (which, btw, should be at least 4 minutes (if the function call takes that long), not 2 secs...).
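A minimal sketch of that first half, the immediate "please wait" response (the file name result.html and the 300-second delay are placeholders, not from the original post):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build the holding page: a meta refresh pointing at the file the
# background job will eventually create. The delay must cover the
# job's runtime (here 300 s, not 2 s).
my $page = <<'HTML';
<html><head>
<meta http-equiv="refresh" content="300;url=result.html">
</head><body>
<p>Working... this page will reload when the result is ready.</p>
</body></html>
HTML

# Send it immediately, then hand the long-running call off to a
# background child process so this response reaches the browser now.
print "Content-type: text/html\n\n", $page;
```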

      In case the webserver is Apache running on something unix-ish, you'd fork (and if needed exec) a child process to run something in the background. But don't forget to close (or redirect to a file) STDOUT and STDERR in the child before you start the long-running task. Otherwise Apache will wait on the other end of the pipes set up to the CGI process until both the parent CGI and its child have closed the handles — meaning the initial request would wait/hang for as long as the function call runs (STDIN doesn't need to be closed, because Apache is on the sending side of that pipe and closes it itself).
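That fork-and-close pattern can be sketched as follows (function_call() and the file names are stand-ins for the poster's own code; the child here fakes the 4-minute job by writing and renaming the result file):

```perl
use strict;
use warnings;

# Stand-in for the slow job: write the result to a temp file, then
# rename it atomically to the name the refresh page points at.
sub function_call {
    open my $fh, '>', 'result.tmp' or die "open: $!";
    print $fh "done\n";
    close $fh;
    rename 'result.tmp', 'result.html' or die "rename: $!";
}

my $pid = fork;
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # child: release Apache's pipes *before* the slow work, otherwise
    # the request hangs until the child exits
    open STDOUT, '>', '/dev/null' or die "reopen STDOUT: $!";
    open STDERR, '>', '/dev/null' or die "reopen STDERR: $!";
    function_call();
    exit 0;
}
# parent: nothing more to print; exiting closes its end of the pipes
# and lets Apache finish the request while the child keeps running
```

In a real CGI the parent would exit right after the fork; it's left running here only so the result can be inspected.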

        Hi almut, Is this the concept you are talking about ... Thanks
        Thanks almut for pointing me in the right direction. One last question: you said that on *nix you'd fork/exec the child process, and the concept does the same... but I am working with Perl on Windows. Will the same thing work fine on Windows too? I believe it will ..
        Hi almut,

        It's still expiring ... I am doing it like below

        if (my $pid = fork) {
            # parent does
            # delete_all();   # clear parameters
            print redirect('http://localhost/test.html');
        } elsif (defined $pid) {
            close STDERR;
            close STDOUT;
            function_call();
            exit 0;
        } else {
            die "message";
        }
        It's still not redirecting, and in the end the page expires. How should I troubleshoot further? Please help. Thanks

      It's because the web server is waiting for more output from your script before it honors the redirect. You have to tell the web server you're done with output. One way would be to close STDOUT:
      print CGI->redirect(...);
      close STDOUT;
      Another is to set your buffers to autoflush and send an End-Of-Transmission character:
      $| = 1;
      print CGI->redirect(...);
      print chr(4);   # EOT
      I've never actually had to do something like this, so others might have a more elegant solution.
