PerlMonks
Background Process

by Anonymous Monk
on Jun 26, 2000 at 00:14 UTC [id://19769]

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I am writing a CGI script in Perl that, once you complete a form, should wait a given delay before it emails the person who filled out the form. If I put a sleep statement into my code, the whole CGI pauses until the sleep is finished. I would like the web page to finish loading and then have the sleep and email-response code thrown into the background. Is this possible?

Replies are listed 'Best First'.
Re: Background Process
by Aighearach (Initiate) on Jun 26, 2000 at 01:42 UTC
    print $reply_to_user;
    die "cannot fork: $!" unless defined( my $pid = fork() );
    if ($pid) {
        # this is the parent
        exit;
    }
    else {
        # this is the child
        open STDOUT, '>/dev/null' or die "Can't write to /dev/null: $!";
        sleep 10;
        # do stuff
    }

     

    update: thanks, Randal. ( btw, you don't have to plug your column each post, you're already famous. ;)

    Additional update: Intrepid: I left STDERR alone because it should be connected to the web server error log, which is probably where you want the errors from a CGI.

    ncw: Well, I am certainly a lover of doing it the easy way! :) Thanks for that, I've been using a home rolled Daemon module, but I see that I don't need to. But, is this a daemon? Proc::Daemon seems to close STDERR, so then we have to reopen it somehow to get the errors back into the webserver logs where we need them.

    Paris Sinclair    |    4a75737420416e6f74686572
    pariss@efn.org    |    205065726c204861636b6572
    I wear my Geek Code on my finger.
    
        Or if you are feeling lazy you can do all the above forking, closing, reopening etc with :-
        use Proc::Daemon;
        # ...
        Proc::Daemon::Init();
      make it more portable this way:
      use File::Spec;
      my $devnul = File::Spec->devnull();
      print $reply_to_user;
      die "cannot fork: $!" unless defined( my $pid = fork() );
      if ($pid) {
          # this is the parent
          exit;
      }
      else {
          # this is the child
          open STDOUT, ">$devnul" or die "Can't write to nowhere: $!";
          open STDERR, ">$devnul" or die "Can't write to nowhere: $!";
          sleep 10;
          # do stuff
      }
      This is a suggestion I have not tried, but BTW on WinNT the string value in $devnul is 'nul'.

      Good Meditations,
      soren andersen (Intrepid)

      Q: What is the sound of several hundred Perler-dogs chasing cool code?
      A: YAPC - YAPC - YAPC .. ! :)

RE: Background Process
by httptech (Chaplain) on Jun 26, 2000 at 01:33 UTC
    An easy (non-perl) way to do this if you are using sendmail is to pass the -odq option to sendmail. This causes sendmail to queue the mail for sending later, instead of sending it immediately. Of course, you don't get fine-grained control over how long the delay is. Sendmail will process it at the next queue run interval, which is set in the sendmail.cf file.

    If you wanted to clean the queue out every few minutes instead of waiting for sendmail to do it, you could always set up a cron job to run "sendmail -qSfoo@bar.com" where foo@bar.com is the "From" address on your email.
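    A minimal sketch of the queue-only submission, assuming sendmail lives at /usr/sbin/sendmail (adjust for your system); -t tells sendmail to take the recipients from the message headers:

    ```perl
    use strict;
    use warnings;

    # -odq = queue the message for the next queue run instead of
    # delivering immediately; -t = read recipients from the headers.
    sub sendmail_cmd {
        my ($sendmail_path) = @_;
        return ($sendmail_path, '-odq', '-t');
    }

    my @cmd = sendmail_cmd('/usr/sbin/sendmail');
    if (-x $cmd[0]) {    # guard so the sketch is safe to run anywhere
        open my $mh, '|-', @cmd or die "can't run sendmail: $!";
        print $mh "To: foo\@bar.com\n",
                  "Subject: form received\n\n",
                  "Thanks for filling out the form.\n";
        close $mh or warn "sendmail exited nonzero: $?";
    }
    ```

    The CGI returns as soon as the message is queued; delivery happens at sendmail's next queue run.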

Re: Background Process
by chromatic (Archbishop) on Jun 26, 2000 at 02:02 UTC
    You could do other things with a separate process:
    • Write the e-mail data to a directory somewhere. Have a cron job run and pass those pseudo-queued messages off to your MTA every 5 minutes.
    • Write another program that uses a named pipe or a socket. Have your CGI program connect to the other, pass the data through and return, and let the other program handle the delay. (Could be tricky if you're sleeping while another instance of the CGI program wants to connect.)
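    The first option above might look like this on the CGI side; the spool directory and filename scheme are assumptions, and the cron job that feeds the files to the MTA isn't shown:

    ```perl
    use strict;
    use warnings;
    use File::Temp ();

    # CGI side of the pseudo-queue: drop each pending message into a
    # spool directory as its own file; a cron job later hands these
    # to the MTA and unlinks them.
    sub spool_message {
        my ($dir, $to, $body) = @_;
        my ($fh, $file) =
            File::Temp::tempfile('msgXXXXXX', DIR => $dir);
        print $fh "To: $to\n\n$body\n";
        close $fh or die "can't write $file: $!";
        return $file;    # tempfile names are unique, so no clashes
    }

    # e.g. spool_message('/var/spool/webform', 'foo@bar.com', 'Thanks!');
    ```

    Because each message is its own file with a unique name, concurrent CGI instances never step on each other.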
RE: Background Process
by BBQ (Curate) on Jun 26, 2000 at 05:08 UTC
    I usually just reverse the order of things: print out the HTML response, then do whatever it is you need to do. In cases such as sending email, it's not a big issue to print the page first and send the email later, but I would recommend against this if the user has requested a database update or anything else that would require an error message if it fails.

    This doesn't solve your problem 100%, since the browser's status icon will keep moving until the script has actually finished, but at least you'll get some output first and the user can continue to navigate without having to wait for completion.

    #!/home/bbq/bin/perl
    # Trust no1!
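    A sketch of that ordering, with STDOUT unbuffered so the page is flushed before the slow part starts (the page body is just an illustration):

    ```perl
    use strict;
    use warnings;

    # The complete response, headers first.
    sub page {
        return "Content-type: text/html\n\n"
             . "<html><body>Thanks, your form was received.</body></html>\n";
    }

    $| = 1;          # unbuffer STDOUT so the page goes out right away
    print page();

    # The user now has the whole page, though the browser's status
    # icon keeps spinning until we exit.  Do the slow work here.
    sleep 1;         # stand-in for the real delay
    # ... send the mail ...
    ```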
Re: Background Process
by Anonymous Monk on Jun 26, 2000 at 16:27 UTC
    Have a separate, continuously running Perl script that takes mail via a named pipe, which your in-Apache Perl process can feed. It can then hold the mail for an arbitrary amount of time before handing it to sendmail. Why not a fork? Depending on how long you want to hold the mail and how many messages are involved, you could fill your process table, which is definitely not the zen way.
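    The CGI side of that might look like the sketch below; the FIFO path is an assumption, the daemon itself isn't shown, and the pipe has to be created once with mkfifo before anything can use it:

    ```perl
    use strict;
    use warnings;

    # CGI side: hand one address per line to the long-running mailer
    # daemon through a named pipe.  Make sure the daemon holds the
    # read end open, or this open() will block.
    sub send_to_daemon {
        my ($fifo, $address) = @_;
        open my $pipe, '>', $fifo or die "can't open $fifo: $!";
        print $pipe "$address\n";
        close $pipe or die "can't close $fifo: $!";
    }

    # e.g. send_to_daemon('/var/run/mailqueue.fifo', 'foo@bar.com');
    ```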
RE: Background Process
by DrManhattan (Chaplain) on Jun 26, 2000 at 18:45 UTC

    If you aren't worried about portability, use at(1).

    - Matt
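    For instance, a one-shot job handed to at from the CGI; the delay and the mail-sending script path are both made-up examples, and the actual call is left commented out:

    ```perl
    use strict;
    use warnings;

    # Queue the delayed work with at(1): the job runs once, later,
    # outside the CGI process -- no daemon of our own and no cron entry.
    sub schedule_with_at {
        my ($delay, $command) = @_;
        open my $at, '|-', 'at', $delay or die "can't run at: $!";
        print $at "$command\n";
        close $at or die "at exited nonzero: $?";
    }

    # schedule_with_at('now + 10 minutes',
    #                  '/usr/local/bin/send-form-mail foo@bar.com');
    ```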

Re: Background Process
by Maqs (Deacon) on Jun 26, 2000 at 15:22 UTC
    hm, this is not a clean Perl solution, but...
    simply write out a file with the person's address and continue working with your page,
    then have a program launched by crontab (or... task manager) every, hm, 15 minutes which would check the files in the specified directory, send the e-mail and delete the file.
    /Maqs.
