I've inherited a system that downloads bioinformatics data from various sites using Net::FTP. There are many scripts that use FTP to examine remote files and download those that are new and wanted. All this works fairly well, but I've been asked to add a way to limit the bandwidth our downloads consume.
There are many cases of code like this:
$f_curdir = $ftp1->pwd();                # remember the current remote directory
$rc = $ftp1->cwd($dir_to_use);           # change to the directory of interest
@new_remote_list = $ftp1->dir(".");      # long-format listing of its contents
It's looking like I'll need to replace all uses of Net::FTP with either wget or cURL, and that's a lot of work to gain one feature. Can you think of a way to limit the bandwidth these downloads use without replacing Net::FTP?
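One direction I've been sketching (untested against a real server, so treat it as a sketch): instead of calling get(), use retr(), which returns a Net::FTP::dataconn object. That lets me read the data connection in fixed-size chunks and sleep whenever the transfer gets ahead of the allowed rate. The helper name throttled_get and the $bps parameter below are my own inventions, not part of Net::FTP:

    use strict;
    use warnings;
    use Net::FTP;
    use Time::HiRes qw(time sleep);    # fractional sleep for fine-grained pacing

    # Hypothetical helper: fetch $remote into $local, capped at $bps bytes/sec.
    sub throttled_get {
        my ($ftp, $remote, $local, $bps) = @_;
        my $conn = $ftp->retr($remote) or return;      # Net::FTP::dataconn
        open my $out, '>', $local or do { $conn->abort; return };
        binmode $out;

        my $chunk = 8192;
        my $start = time();
        my $total = 0;
        while ((my $n = $conn->read(my $buf, $chunk)) > 0) {
            print {$out} $buf;
            $total += $n;
            # Seconds this much data "should" have taken at the target rate;
            # if we're ahead of schedule, sleep off the difference.
            my $expected = $total / $bps;
            my $elapsed  = time() - $start;
            sleep($expected - $elapsed) if $expected > $elapsed;
        }
        close $out;
        return $conn->close;    # true if the server reports a complete transfer
    }

Existing calls like $ftp1->get($file) would become throttled_get($ftp1, $file, $file, 50_000) for a ~50 KB/s cap. This only throttles retrievals; pwd()/cwd()/dir() traffic on the control channel is negligible, so those calls could stay untouched. Whether this is robust enough for production is exactly what I'm unsure about.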