
Good method to pass files through perl/cgi script?

by Martin90 (Sexton)
on Jul 10, 2013 at 10:33 UTC ( #1043441=perlquestion )
Martin90 has asked for the wisdom of the Perl Monks concerning the following question:

Hello monks, I have a website which allows young music producers and artists to share their music with others. In a CGI script I use open to serve files for download:
    my $fsize = -s "$f->{upload_dir}/$file";
    $|++;
    open(my $fh, '<', "$f->{upload_dir}/$file")
        or die "Can't open source file: $!";
    binmode $fh;
    print qq{Content-Type: application/octet-stream\n};
    print qq{Content-Length: $fsize\n};
    #print qq{Content-Disposition: attachment; filename="$fname"\n};
    print qq{Content-Disposition: attachment\n};
    print qq{Content-Transfer-Encoding: binary\n\n};
    my $chunk = int 1024 * $speed / 10;   # bytes per 0.1s tick
    my $buf;
    while ( read($fh, $buf, $chunk) ) {
        print $buf;
        select(undef, undef, undef, 0.1); # throttle to ~$speed KiB/s
    }
It works well, but as my website grows this approach starts to consume too much RAM, and I need to look for a better way which consumes less. I have read that there are good, powerful ways to hand files off to the system with nginx support, but I have no idea at this time how to implement that. So here is my question: how do I make my download script fast, flexible and light?

Replies are listed 'Best First'.
Re: Good method to pass files through perl/cgi script?
by daxim (Chaplain) on Jul 10, 2013 at 11:01 UTC
    You want the sendfile(2) system call, implemented in Apache httpd, lighttpd and nginx with the X-Sendfile header or similar.
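With X-Sendfile the CGI script only emits headers and the web server streams the file itself, so almost no memory is held in the Perl process. A minimal sketch, assuming Apache with mod_xsendfile enabled (the filename and directory are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Per-download bookkeeping (auth checks, counters) still happens here
# in Perl, but the actual transfer is delegated to the web server.
my $file = 'song.mp3';    # illustrative filename

print qq{Content-Type: application/octet-stream\n};
print qq{Content-Disposition: attachment; filename="$file"\n};
print qq{X-Sendfile: /var/uploads/$file\n\n};  # server sends the bytes
```

Under nginx the equivalent header is X-Accel-Redirect, and its value is a URI mapped to an internal location rather than a filesystem path.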
Thanks. Is it worth installing nginx as well, or should I just use apache?
Re: Good method to pass files through perl/cgi script?
by thomas895 (Chaplain) on Jul 10, 2013 at 23:12 UTC

    Since you don't seem to be doing anything very server-related, why not just put the files in a directory and allow those to be downloaded directly?

    If that is not an option, then consider using a file host or CDN of sorts, where you can upload these files to as they come in.

    "Excuse me for butting in, but I'm interrupt-driven..."
No, what I am doing is very server-related. I want to keep the content on my server, since I have my own music player plus statistics on downloaded files. Also, I would like to track whether files were downloaded completely, control the download speed, and so on. So, if I use nginx alongside apache and pass files through nginx with X-Sendfile, would that help much? Maybe there are other options?
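If nginx serves the files via X-Accel-Redirect, the Perl script still runs first (so it can check access and log the download), and nginx's limit_rate directive covers the speed control that the CGI read loop was doing by hand. A hedged config sketch; the location name and paths are made up for illustration:

```
# nginx: files under /protected/ are reachable only via an
# X-Accel-Redirect header emitted by the backend script
location /protected/ {
    internal;
    alias /var/uploads/;
    limit_rate 100k;   # per-connection download speed cap
}
```

The CGI script then prints an `X-Accel-Redirect: /protected/$file` header instead of streaming the bytes itself.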

        As far as I know, you can't control or even see the download speed in a CGI program. Your script just reads and outputs the whole file to STDOUT, which the server then just passes along to the client. The server probably buffers it, too, so the time it took for the file to be output does not necessarily have to be the time it really took to download. Sendfile won't tell you that, either.

        As far as statistics go, you can write or use a logfile analyser, then count how many times each file was downloaded and by whom, etc.
        Fun fact: you can also add query parameters to the static file locations, and most webservers will not do anything with them by default. Use this to add extra info that you might want to analyse later.
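The log-analysis idea above can be sketched in a few lines of Perl. This assumes the common combined log format, with the request line quoted as `"GET /path HTTP/1.1"`; the script name is hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %count;
while ( my $line = <> ) {
    # e.g. ... "GET /files/song.mp3?user=42 HTTP/1.1" ...
    next unless $line =~ m{"(?:GET|POST)\s+(\S+)};
    my ($path) = split /\?/, $1;   # drop the query-string extras
    $count{$path}++;
}
printf "%6d  %s\n", $count{$_}, $_
    for sort { $count{$b} <=> $count{$a} } keys %count;
```

Run it as `perl count_downloads.pl access.log` to get a per-file download tally.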

        "Excuse me for butting in, but I'm interrupt-driven..."

Node Type: perlquestion [id://1043441]
Approved by Corion