I've inherited a system that downloads bioinformatics data from various sites using Net::FTP. Many scripts use FTP to examine remote files and download those that are new and wanted. All of this works fairly well, but I've been asked to add a way to limit the bandwidth our downloads consume.
There are many cases of code like this:
$f_curdir = $ftp1->pwd();            # remember the current directory
$rc = $ftp1->cwd($dir_to_use);       # change to the target directory
$ftp1->binary;                       # binary transfer mode
@new_remote_list = $ftp1->dir(".");  # list files in that directory
It looks like I'll need to replace every use of Net::FTP with wget or cURL, which is a lot of work to gain one feature. Can you think of a way to limit the bandwidth used by these downloads without replacing Net::FTP?
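One approach that avoids switching tools is to bypass the all-at-once get() for large files and instead use Net::FTP's retr(), which returns a Net::FTP::dataconn object you can read in chunks, sleeping between reads to cap throughput. Here is a minimal sketch of that idea; throttled_get and throttle_delay are hypothetical helper names, and this has not been tested against a live server:

```perl
use strict;
use warnings;
use Net::FTP;
use Time::HiRes qw(sleep time);

# How long to sleep so that $total bytes transferred over $elapsed
# seconds stays at or below $bytes_per_sec.
sub throttle_delay {
    my ($total, $elapsed, $bytes_per_sec) = @_;
    my $delay = $total / $bytes_per_sec - $elapsed;
    return $delay > 0 ? $delay : 0;
}

# Download $remote to $local at roughly $bytes_per_sec or less,
# reading the data connection in 8 KB chunks.
sub throttled_get {
    my ($ftp, $remote, $local, $bytes_per_sec) = @_;
    my $data = $ftp->retr($remote) or return;
    open my $out, '>', $local or return;
    binmode $out;
    my ($total, $start) = (0, time());
    my $buf;
    while ((my $n = $data->read($buf, 8192)) > 0) {
        print {$out} $buf;
        $total += $n;
        my $d = throttle_delay($total, time() - $start, $bytes_per_sec);
        sleep($d) if $d > 0;
    }
    $data->close;
    close $out or return;
    return $total;
}
```

Existing scripts could then call, say, throttled_get($ftp1, $file, $file, 100_000) in place of $ftp1->get($file), leaving all the pwd/cwd/dir logic untouched.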