Martin90 has asked for the wisdom of the Perl Monks concerning the following question:
Hello monks,
I have a website that allows young music producers and artists to share their music with others. In my CGI script I use the open function to pass files for download:
    my $fsize = -s "$f->{upload_dir}/$file";
    $|++;                       # unbuffer output so chunks are sent immediately
    open( my $in_fh, '<', "$f->{upload_dir}/$file" )
        or die "Can't open source file: $!";
    binmode $in_fh;             # the files are binary
    print qq{Content-Type: application/octet-stream\n};
    print qq{Content-Length: $fsize\n};
    #print qq{Content-Disposition: attachment; filename="$fname"\n};
    print qq{Content-Disposition: attachment\n};
    print qq{Content-Transfer-Encoding: binary\n\n};
    # throttle to roughly $speed KB/s: one chunk every 0.1 s
    my $chunk = int( 1024 * $speed / 10 );
    my $buf;
    while ( read( $in_fh, $buf, $chunk ) ) {
        print $buf;
        select( undef, undef, undef, 0.1 );   # sleep 0.1 s
    }

It works well, but as my website has grown this approach has started to consume too much RAM, and I need to look for a better way that consumes less. I have read that there are good, powerful approaches, such as passing the file through the web server with nginx support, but at this time I have no idea how to implement that. So here is my question: how can I make my download script fast, flexible, and light?
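For reference, the nginx approach alluded to above is usually implemented with the X-Accel-Redirect response header: the Perl script does authentication and prints only headers, and nginx itself streams the file, so the Perl process holds no file data in RAM. A minimal sketch, under the assumption that nginx sits in front of the CGI app; the location name /protected/ and the path /var/uploads/ are hypothetical placeholders:

    # nginx.conf: an internal-only location mapped onto the upload directory;
    # clients cannot request /protected/... directly, only via X-Accel-Redirect
    location /protected/ {
        internal;
        alias /var/uploads/;
    }

    # CGI side: emit headers only, then let nginx serve the bytes
    print qq{Content-Type: application/octet-stream\n};
    print qq{Content-Disposition: attachment\n};
    print qq{X-Accel-Redirect: /protected/$file\n\n};

With this design the script exits as soon as the headers are printed; throttling, if still wanted, would be done in nginx (e.g. its limit_rate directive) rather than with a read/sleep loop in Perl.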
Replies are listed 'Best First'.
Re: Good method to pass files throught perl/cgi script ?
by daxim (Curate) on Jul 10, 2013 at 11:01 UTC
    by Martin90 (Sexton) on Jul 10, 2013 at 15:04 UTC
Re: Good method to pass files throught perl/cgi script ?
by thomas895 (Deacon) on Jul 10, 2013 at 23:12 UTC
    by Martin90 (Sexton) on Jul 11, 2013 at 10:17 UTC
    by thomas895 (Deacon) on Jul 11, 2013 at 23:24 UTC