
Complete part of a cgi script before the rest

by e5z8652 (Novice)
on Apr 14, 2010 at 23:09 UTC ( #834793=perlquestion )
e5z8652 has asked for the wisdom of the Perl Monks concerning the following question:

I think I'm just having a bad google day.

This cgi script does three things:

1) Creates a short HTML document that tells the user "your data will arrive in an e-mail!"

2) Kicks off a database query that can take a LONG time. Like long enough that Apache will time out (say 30 minutes for a big query -- it's a 35GB database).

3) Wraps up the query result into an Excel spreadsheet and e-mails it off to the user.

The script is also written in exactly this order.

What I would *LIKE* to see is the script send off the HTML right away. The user gets immediate feedback and goes off to do other things while their e-mail is cooking. Once the HTML has been pushed to their browser, they're done anyway.

What I am *ACTUALLY* seeing is the script waits for the query to complete, then sends off the HTML to the browser.

Is there a way to get a single cgi script to send part of its output right away while it lets other parts cook?

I guess I could cut it into two different CGI scripts, but I don't want to if I don't have to.

use CGI qw(:standard);
use CGI::Carp qw(fatalsToBrowser);
use CGI::Carp qw(warningsToBrowser);
use CGI::Pretty;
use Date::Manip;
use DBI;
use Net::SMTP;
use MIME::Lite;
use Spreadsheet::WriteExcel;
use strict;
use warnings;

my $results_ref;
my $boundary = 'aecar' . time;

# Get the form contents
my $HTML_GEN     = new CGI;
my $request_user = $HTML_GEN->param('request_user');
my $raw_user     = $HTML_GEN->param('user');
my $raw_date1    = $HTML_GEN->param('date1');
my $raw_date2    = $HTML_GEN->param('date2');
my $raw_time1    = $HTML_GEN->param('time1');
my $raw_time2    = $HTML_GEN->param('time2');
my $raw_search   = $HTML_GEN->param('search');
my $flag = '0';

# Set MySQL variables:
my $host      = "db_server";
my $database  = "database";
my $tablename = "table1";
my $dbuser    = "db_user";
my $dbpw      = "db_password";

# Lower-case the username and search string:
$raw_user   =~ tr/A-Z/a-z/;
$raw_search =~ tr/A-Z/a-z/;

# Clean up username and search string:
$raw_search =~ /^([\w.:\/]*)$/;
my $search = $1;
$raw_user =~ /^([\w ]*)$/;
my $user = $1;

# Add underscore if we were accidentally passed a space:
$user =~ tr/ /_/;

# Fix the dates to MySQL format:
my $date1 = UnixDate($raw_date1, "%Y-%m-%d");
my $date2 = UnixDate($raw_date2, "%Y-%m-%d");
my $time1 = UnixDate($raw_time1, "%H:%M:%S");
my $time2 = UnixDate($raw_time2, "%H:%M:%S");

# Sanity check on dates and times to keep them in order:
$flag = Date_Cmp($date1, $date2);
if ($flag > 0) {
    ($date1, $date2) = ($date2, $date1);
    ($time1, $time2) = ($time2, $time1);
}

# At this point, all user input is sanitized to some level, right?
# A smart user can probably still drop the db though. :(

# MySQL connection:
my $db_use = DBI->connect("DBI:mysql:database=$database;host=$host;port=3306",
    $dbuser, $dbpw) or die "Database connection not made: $DBI::errstr";

# MySQL query syntax:
my $user_sql = qq(SELECT date, time, INET_NTOA(ip), url, bytes, authuser
    FROM $tablename
    WHERE authuser = ? AND url LIKE ? AND date BETWEEN ? AND ?
      AND DATE_ADD(date, INTERVAL time HOUR_SECOND)
          BETWEEN CONCAT(?, ' ', ?) AND CONCAT(?, ' ', ?)
    ORDER BY date, time);
my $user_query = $db_use->prepare($user_sql) or die($db_use->errstr);
$user_query->execute($user, "%$search%", $date1, $date2,
    $date1, $time1, $date2, $time2) or die($user_query->errstr);

# Tell the user the report is on its way:
print $HTML_GEN->header();
print $HTML_GEN->start_html();
print $HTML_GEN->p("An Excel spreadsheet with the requested information will be mailed to $request_user\@domain");
print $HTML_GEN->end_html;

# Fill the spreadsheet data with fetchrow:
my $Workbook  = Spreadsheet::WriteExcel->new("/tmp/$user.xls");
my $Worksheet = $Workbook->add_worksheet("$user");
my $cntr = 1;   # start at 1 because 0 is the header row

# Create the header row, one cell at a time:
$Worksheet->write(0, 0, "Date");
$Worksheet->write(0, 1, "Time");
$Worksheet->write(0, 2, "IP Address");
$Worksheet->write(0, 3, "URL");
$Worksheet->write(0, 4, "Bytes");
$Worksheet->write(0, 5, "User");

while (my @results = $user_query->fetchrow()) {
    $results_ref = \@results;   # write() takes an array ref to write a whole row
    $Worksheet->write($cntr, 0, $results_ref);
    $cntr++;
}
$Workbook->close;
$db_use->disconnect;

my $message = MIME::Lite->new(
    From    => "$request_user\@domain",
    To      => "$request_user\@domain",
    Subject => "Report for $user",
    Type    => "TEXT",
    Data    => "Report for $user from $date1 $time1 to $date2 $time2.\n"
);
$message->attach(
    Type     => "application/vnd.ms-excel",
    Path     => "/tmp/$user.xls",
    Filename => "$user.xls"
);
$message->send('smtp', '', Timeout => 60);

Replies are listed 'Best First'.
Re: Complete part of a cgi script before the rest
by ikegami (Pope) on Apr 14, 2010 at 23:53 UTC

    Output the HTML, then daemonize.

    Since you mentioned Excel, you might be on Windows. The same idea applies, but the execution is different. Output the HTML, then spawn off a child to do the background work. system(1, ...) is an easy way to launch a background process on Windows.

    Update: Oops, you might be on Windows! Added second paragraph.
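    A minimal sketch of the Unix-side idea, assuming POSIX::setsid is available. The CGI prints and flushes its HTML, then calls daemonize(); the parent exits so Apache closes the connection, while the detached child carries on with the slow query (the daemonize name and usage comments are illustrative, not from the original post):

```perl
use strict;
use warnings;
use POSIX qw(setsid);

# Called after the HTML page has been printed and flushed.
# The CGI process exits immediately; the detached child keeps running.
sub daemonize {
    # First fork: the parent exits so the CGI request can finish.
    defined(my $pid = fork) or die "fork failed: $!";
    exit 0 if $pid;                 # parent: Apache sees the script end

    # New session, so we are no longer tied to Apache's process group.
    setsid() or die "setsid failed: $!";

    # Detach from the web server's filehandles, or Apache will keep
    # the connection open waiting for more output.
    open STDIN,  '<', '/dev/null' or die "reopen STDIN: $!";
    open STDOUT, '>', '/dev/null' or die "reopen STDOUT: $!";
    open STDERR, '>', '/dev/null' or die "reopen STDERR: $!";
}

# Usage sketch:
#   print $q->header, $q->start_html, ..., $q->end_html;
#   daemonize();
#   ... run the 30-minute query and mail the spreadsheet ...
```

    Note that the DBI connect/prepare/execute must happen after daemonize(), not before, or the browser still waits on the query.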

      Oh that was ridiculously easy.

      I did have to fix an error in my script above.

      If you look, you'll see that I execute my query in the line just before the HTML is generated. So at first when I added the daemonize call just after the HTML closed I was still getting the long delays.

      Once I moved all of the MySQL stuff into the same area (below the HTML) it worked perfectly.

      Just one more reason to use subroutines to keep my code clean. I kill myself with messy code. :(

        Just a thought. You may want to consider having the CGI process just queue a request to a daemon to prevent your server/database being overloaded by multiple simultaneous requests. You would simply do an insert into a request queue table with any necessary parameters, and have the daemon periodically check for new requests.
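        A rough sketch of that queue idea; the request_queue table, its columns, and the sub names here are all hypothetical, and a connected DBI handle ($dbh) is assumed:

```perl
use strict;
use warnings;
# Assumes DBI is loaded and $dbh is a connected handle in the real script.

# Hypothetical queue table, for illustration only:
#   CREATE TABLE request_queue (
#       id           INT AUTO_INCREMENT PRIMARY KEY,
#       request_user VARCHAR(64), search VARCHAR(255),
#       date1 DATE, date2 DATE, time1 TIME, time2 TIME,
#       status ENUM('new','running','done') DEFAULT 'new'
#   );

# CGI side: insert the request parameters and return immediately.
sub enqueue_request {
    my ($dbh, %p) = @_;
    $dbh->do(
        'INSERT INTO request_queue
             (request_user, search, date1, date2, time1, time2)
         VALUES (?, ?, ?, ?, ?, ?)',
        undef, @p{qw(request_user search date1 date2 time1 time2)},
    );
}

# Daemon side: poll periodically (e.g. from a sleep loop) for new work.
sub next_request {
    my ($dbh) = @_;
    return $dbh->selectrow_hashref(
        q{SELECT * FROM request_queue
          WHERE status = 'new' ORDER BY id LIMIT 1});
}
```

        The daemon would mark a row 'running' before starting the query, so two pollers never grab the same request.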

      OK, thanks. I'll experiment with this -- I've got another project going that is using an infinite while loop to read a serial port and dump data to MySQL. That works well, but I think this solution would be perfect for that script as well.

      Actually the server is Linux, but the clients have Windows desktops -- hence the Excel files. Excel can open csv, but I need to eliminate any potential confusion.

Re: Complete part of a cgi script before the rest
by pemungkah (Priest) on Apr 14, 2010 at 23:33 UTC
    The simplest option is to fork off the process that takes a long time so it's a child of init, and then just complete the short part. See here for a way to do that.
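    One common way to sketch that double-fork (the run_detached name is made up for illustration): the intermediate child exits at once, so the worker is reparented to init and the CGI has no long-running child to wait on.

```perl
use strict;
use warnings;
use POSIX qw(setsid);

# Run a coderef in a grandchild process that is a child of init.
sub run_detached {
    my ($work) = @_;                # coderef holding the slow job
    defined(my $pid = fork) or die "fork: $!";
    if ($pid) {                     # CGI process: reap the short-lived child
        waitpid($pid, 0);
        return;                     # go on to finish the HTTP response
    }
    setsid();                       # detach from Apache's session
    defined(my $pid2 = fork) or die "fork: $!";
    exit 0 if $pid2;                # intermediate child exits immediately
    $work->();                      # grandchild: now a child of init
    exit 0;
}
```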
Re: Complete part of a cgi script before the rest
by scorpio17 (Abbot) on Apr 15, 2010 at 14:51 UTC

Node Type: perlquestion [id://834793]
Approved by GrandFather