Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:


I want to make a form that can handle uploading large files... it should:

I am running on a shared host where I cannot recompile Apache, add new Apache modules, or edit the httpd.conf file ... but I can set an AddHandler for Apache through the control panel, and I can install Perl modules.

Thanks for helping me!

Replies are listed 'Best First'.
Re: large file uploads
by zentara (Archbishop) on Feb 21, 2010 at 11:28 UTC
    I'm not a CGI expert, but you need JavaScript to show upload progress of any kind from an HTML upload form. To do the upload itself, all you need is the CGI module (google for 'perl CGI huge file upload'); the real question is the timeout issue. If you are on shared hosting, they probably have a per-user limit on hogging Apache's time. You should probably switch to FTP for this, especially if you want to resume broken transfers.
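    A minimal sketch of the CGI.pm route zentara mentions, reading the upload in chunks so a large file never sits fully in memory. The form field name ('upload'), the /tmp destination, and the 10MB cap are assumptions you would adjust for your host:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Copy one filehandle to another in 64KB chunks so a large upload
# never has to be held in memory all at once.
sub copy_fh {
    my ($in, $out) = @_;
    binmode $_ for $in, $out;
    my $buf;
    while ( read $in, $buf, 64 * 1024 ) {
        print {$out} $buf or return 0;
    }
    return 1;
}

# The handler itself; only runs when the script is invoked by the web
# server. Field name 'upload' and the /tmp path are illustrative.
sub handle_upload {
    require CGI;
    $CGI::POST_MAX = 10 * 1024 * 1024;   # refuse request bodies over ~10MB
    my $q  = CGI->new;
    my $fh = $q->upload('upload');       # undef if nothing was posted
    if ($fh) {
        open my $out, '>', '/tmp/upload.dat' or die "open: $!";
        copy_fh( $fh, $out ) or die "write: $!";
        close $out           or die "close: $!";
        print $q->header('text/plain'), "Upload complete\n";
    }
    else {
        print $q->header('text/plain'), "No file received\n";
    }
}

handle_upload() if $ENV{GATEWAY_INTERFACE};
```

    Note this does nothing about the timeout problem itself; it only keeps memory use flat while the transfer lasts.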

    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku
      There's also commonly a maximum size set for file uploads, typically 10MB.
Re: large file uploads
by trwww (Priest) on Feb 21, 2010 at 19:41 UTC


    I've successfully set something like this up with FancyUpload.

    There is no way I'd try to do this on a shared host. There is just too much that can go wrong, and you won't be able to do anything about it.