I've been searching the web and playing with my code for a while now and can't find a solution to this problem.
I would like to limit upload sizes to my site's CGI scripts. I'm setting $CGI::POST_MAX = 1024 * 1024 * 20; for a 20 MB limit, but when I test it with a 900 MB file, the upload goes through just fine. I've tried every modification I can think of, including moving the $CGI::POST_MAX assignment around, changing use CGI; to use CGI qw( :standard );, and updating CGI.pm from CPAN; I now have version 4.38 installed. I would welcome any ideas for protecting my server from denial-of-service attacks via oversized uploads.
Here is my code snippet:
#!/usr/bin/perl -T
my $version = "0.1";
use strict;
use warnings;
use CGI;
use CGI::Carp qw/fatalsToBrowser/;

$CGI::POST_MAX = 1024 * 1024 * 20;    # maximum upload size is 20 MB
$| = 1;

my $q = CGI->new;
print $q->header;
print $q->start_html( -title => "mysite",
                      -style => { -src => '/plasma/style.css' } );
print $q->start_multipart_form( -method => 'POST',
                                -action => '/cgi-bin/start_analysis.cgi' );
print $q->filefield( -name => "uf_core",
                     -size => 20 );
print "\tFile type: ";
print $q->popup_menu( -name    => "urt",
                      -values  => [ 'fasta', 'genbank' ],
                      -default => 'fasta' );
print "<br><br>";
# raw HTML because I need a multi-file input ("multiple" is a boolean attribute)
print qq{<input type="file" name="uf_qry" size="20" multiple="multiple" />};
print "<br>";
print $q->submit( -value => "ANALYZE!" );
print $q->end_multipart_form;
print $q->end_html;
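In case it matters: the script above only prints the form, and I'm not sure whether the limit also has to be set in the script the form posts to. Here is a minimal sketch of what I understand the check in /cgi-bin/start_analysis.cgi would look like (this is not my real analysis code; the CGI.pm docs say an over-limit POST makes param() return nothing and cgi_error() return a "413 Request entity too large" string):

```perl
#!/usr/bin/perl -T
# Sketch of the receiving script (/cgi-bin/start_analysis.cgi).
# Assumption: the limit must be enforced here, in the script that
# actually receives the multipart POST, not only in the form script.
use strict;
use warnings;
use CGI;

# Must be set before CGI->new, which is what reads and parses the POST.
$CGI::POST_MAX = 1024 * 1024 * 20;    # 20 MB cap on the incoming request

my $q = CGI->new;
print $q->header;

# Per the CGI.pm docs, cgi_error() reports why parsing failed, e.g.
# "413 Request entity too large" when POST_MAX is exceeded.
if ( my $err = $q->cgi_error ) {
    print "Upload rejected: $err\n";
    exit;
}

# upload() returns undef if the POST was rejected or the field is missing.
my $fh = $q->upload('uf_core');
print "Received uf_core\n" if defined $fh;
```

Does the limit in the form script do anything at all, or does only the receiving script's setting count?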
Thanks!