Best practice for sending results to a user via email

by Anonymous Monk
on Apr 22, 2015 at 20:30 UTC ( [id://1124303] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Dear Monks,
I hereby ask your wisdom on the following problem:
I have set up a simple web server in PHP with a submission form (a textarea). When the user submits the form, the contents are written to a file and a Perl script is run on that file. The script writes its output to a text file and also produces an image.
My question has two parts:
1) Because this script takes some time to execute, I think it is not good practice to just let it run on the web server: there is a real possibility it will hang and never produce output. So I thought it best to take the input from the user and later send them an email saying "your work is completed", with a link to an HTML page containing the results.
Does this sound like a reasonable practice to you?
2) Can you give me some hints as to how such a thing is accomplished? What steps should I follow, and can you point me to examples of sending an email that directs the user to an HTML page with the final results?
Thank you in advance!

Replies are listed 'Best First'.
Re: Best practice for sending results to a user via email
by GotToBTru (Prior) on Apr 22, 2015 at 20:41 UTC

    Rewrite the script so it is not so likely to hang, or so that it at least returns a message that it could not complete.

    If the script is likely to hang, it is also likely that your user's work is not completed. I doubt they will appreciate getting a message that says it is, when it isn't.
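    For instance, here is a minimal sketch of bounding the risky work with alarm so the script always reports something (do_the_work and the five-minute limit are placeholders, not part of the original setup):

    use strict;
    use warnings;

    sub do_the_work { sleep 2; return 'done' }   # stand-in for the real processing

    # wrap the long-running step in a timeout so there is always *some* result
    my $result = eval {
        local $SIG{ALRM} = sub { die "timeout\n" };
        alarm 300;                # give up after 5 minutes
        my $r = do_the_work();
        alarm 0;                  # cancel the alarm on success
        $r;
    };
    if (!defined $result) {
        $result = $@ eq "timeout\n"
            ? "Sorry, the job took too long and was aborted."
            : "Sorry, the job failed: $@";
    }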

    Mail::Sendmail is probably a good place to start. There are examples of how to use it there, and a Super Search here will almost certainly turn up others.
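    A minimal sketch of such a notification mail (the addresses and result URL are placeholders):

    use strict;
    use warnings;
    use Mail::Sendmail;

    my %mail = (
        To      => 'user@example.com',           # the address the user submitted
        From    => 'results@example.com',
        Subject => 'Your work is completed',
        Message => "Your results are ready.\n"
                 . "View them here: http://example.com/results/abc123.html\n",
    );

    sendmail(%mail) or die $Mail::Sendmail::error;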

    Dum Spiro Spero
Re: Best practice for sending results to a user via email
by RonW (Parson) on Apr 22, 2015 at 22:03 UTC

    Starting a long-running task from a web server is a reasonable, though unusual, thing to do. I did it once, about 10 years ago, on a Linux system.

    In my CGI script (in Perl), I included something like:

    # define the task
    my $task = <<'_EOT_';    # use ' to prevent interpolation
    #!perl -T
    # use -T to specify taint mode
    use strict;
    use warnings;

    # (your processing goes here)

    # (generate and send notification - suggest using Mail::Sendmail)

    __DATA__
    _EOT_

    # append the data (the task can read it back via the DATA filehandle)
    $task .= join "\n", @data;

    # save the task
    open my $tf, '>', $tmpfile or die "can't write $tmpfile: $!";
    print $tf $task;
    close $tf;

    # submit the task (note: batch feeds this file to sh, so you may need
    # to queue a one-line wrapper such as "perl $tmpfile" instead)
    system('batch', '-M', '-f', $tmpfile);

    # (tell user the task has been submitted)

    Note that I used an often neglected feature of Linux/Unix/POSIX - the batch command. This command is usually part of the "at" package, so if batch is not available on your Linux/Unix/POSIX computer, that's what to install.

    FYI, the batch command queues the specified task and runs it when the machine is not too busy (that is, when the system load drops below a threshold). Queued tasks run in the order they were queued.

    If you can't use batch (or similar), try:

    system("nohup $tmpfile &");

    But using batch will reduce the risk of overloading your server.
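    If batch isn't available and nohup is awkward, another common pattern (my addition here, not part of the original setup) is to fork and detach the child from the web server, assuming $tmpfile holds a runnable Perl script:

    use strict;
    use warnings;
    use POSIX qw(setsid);

    my $tmpfile = '/tmp/task.pl';    # the task file written earlier (placeholder)

    defined(my $pid = fork()) or die "fork failed: $!";
    if ($pid == 0) {                     # child
        setsid();                        # detach from the web server's session
        open STDIN,  '<',  '/dev/null';
        open STDOUT, '>',  '/dev/null';
        open STDERR, '>&', \*STDOUT;
        exec 'perl', $tmpfile or die "exec failed: $!";
    }
    # parent falls through and returns to the browser immediately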

      Starting a long-running task from a web server is a reasonable, though unusual, thing to do. I did it once, about 10 years ago, on a Linux system.
      Is the task still running? ;-)
      لսႽ† ᥲᥒ⚪⟊Ⴙᘓᖇ Ꮅᘓᖇ⎱ Ⴙᥲ𝇋ƙᘓᖇ
      Starting a long-running task from a web server is a reasonable, though unusual, thing to do.

      Perlmonks standard link for that problem: Watching long processes through CGI

      Alexander

      --
      Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
      "At and batch as presently implemented are not suitable when users are competing for resources. If this is the case for your site, you might want to consider another batch system, such as nqs."
Re: Best practice for sending results to a user via email
by Marshall (Canon) on Apr 23, 2015 at 02:15 UTC
    This is a common problem.

    The user will be patient for about 20-30 seconds if you warn them about the delay in advance. Without any warning at all, folks will just give up and cancel the browser session once the wait goes past that. That frustrates everybody: the user is angry, and your server burned a lot of MIPS to no result.

    The idea of not producing a result at all is an incredibly bad one!
    Your program should always produce a result, even if that result is "I can't produce a result".

    I sometimes work with large online DB's, like the U.S. Federal Communications Commission's (FCC.gov). I can submit a complex DB query along with an email address to send the reply to. After maybe 30 minutes, I get a URL via email that allows me to download my results. That URL and the data "expire" after some amount of time (24-48 hours or so). This scheme works very well for complicated queries.

    Of course the best thing is to be performant and generate a quick response, but that is not always possible.
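    One way to get that expiring-URL behavior (a sketch of mine, not necessarily how the FCC does it) is to record each result token with a deadline and have the download script refuse stale ones:

    use strict;
    use warnings;

    # a flat file of "token expiry" pairs; a real site would likely use a DB
    my $tokens = '/var/results/tokens.txt';

    sub remember_token {
        my ($token, $hours) = @_;
        open my $fh, '>>', $tokens or die "can't append to $tokens: $!";
        print $fh $token, ' ', time() + $hours * 3600, "\n";
        close $fh;
    }

    sub token_valid {
        my ($wanted) = @_;
        open my $fh, '<', $tokens or return 0;
        while (my $line = <$fh>) {
            my ($tok, $expiry) = split ' ', $line;
            return time() <= $expiry if $tok eq $wanted;
        }
        return 0;
    }

    remember_token('abc123', 48);    # this link lives for 48 hours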

Re: Best practice for sending results to a user via email
by bitingduck (Chaplain) on Apr 23, 2015 at 05:01 UTC

    This is a fairly common practice on sites that generate custom PDFs. I've seen it for book purchases, where a watermark is applied so the book has your name on it, and on proposal sites, where part of submitting a proposal is generating a PDF from the submitted materials.

    Exactly how you do it depends on the level of security you want for the content generated for the user. The easiest and least secure approach is to generate the file, write it to the web server in a place that's accessible to all visitors, and then send a link to the user with Mail::Sendmail. You can obfuscate things a little by using a hash function to generate unique, unpredictable filenames and by not allowing the directory to be listed, but it's still relatively insecure.
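    For the unpredictable filenames, something along these lines works (Digest::SHA ships with Perl; the hash inputs below are just examples):

    use strict;
    use warnings;
    use Digest::SHA qw(sha256_hex);

    # mix in values an outsider can't guess
    my $token    = sha256_hex(time() . $$ . rand() . 'user@example.com');
    my $filename = "$token.pdf";
    my $url      = "http://example.com/results/$filename";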

    More secure is to generate the file and store it someplace your scripts are allowed to read but that isn't accessible in the web server's directory structure. You generate the file, store it in that storage directory, and record the location in a database. You also create a URL that you store in the same database (or that lets you figure out where you stored the user's file, e.g. a URL that calls http://example.com/cgi/downloader.pl?file=customdownload.pdf), and send the link with a nice note. When the user calls your website with that URL, it executes the script, which passes them the file. If you want it to be secure, you require the user to be logged in to get the file.
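    A minimal sketch of such a downloader.pl, assuming the files live outside the document tree in /var/results and are named with 64-character hex tokens as above (the paths and names are placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q    = CGI->new;
    my $file = $q->param('file') // '';

    # accept only a bare hex token plus .pdf, which also blocks path traversal
    unless ($file =~ /\A[0-9a-f]{64}\.pdf\z/) {
        print $q->header(-status => '400 Bad Request'), "Bad request\n";
        exit;
    }

    my $path = "/var/results/$file";
    if (open my $fh, '<:raw', $path) {
        print $q->header(-type => 'application/pdf');
        local $/ = \65536;           # stream in 64 KB chunks
        print while <$fh>;
        close $fh;
    }
    else {
        print $q->header(-status => '404 Not Found'), "Not found\n";
    }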
