
Downloading a lot of files

by ghenry (Vicar) on Jun 23, 2005 at 23:25 UTC

Description: Have a list of RPMs or Debs? Stick them in a file, one per line, and run this. I use it when upgrading between Fedora distros, if any RPMs have been missed according to upgrade.log.
#!/usr/bin/perl -w
# Read in upgrade.log and fetch any missing RPMs
# GH - 22.06.05
use strict;

use LWP::Simple;

open(RPMS, "<", "./upgrade.log")
    or die "File not found or unreadable: $!\n";

my $url = '...';    # base URL stripped in the original (the RPMS directory of a Fedora mirror)

while (<RPMS>) {
    chomp;    # drop the trailing newline from the package filename
    print -e $_ ? "Already downloaded $_\n" : "Downloading $_\n";
    next if (-e $_);
    getstore($url . $_, $_);
    print -e $_ ? "Got $_\n" : "Download failed for $_\n";
}

close(RPMS);

print "Download complete.\n";
Re: Downloading a lot of files
by ihb (Deacon) on Jun 23, 2005 at 23:47 UTC

    &mirror could be handy here.

    use strict;

    use LWP::Simple;
    use File::Slurp qw/ read_file /;

    my $upgrade_file = './upgrade.log';
    my $url = '...386/os/Fedora/RPMS/';    # leading portion of the URL was stripped in the original

    for (read_file($upgrade_file)) {
        chomp;
        print "Checking $_... ";
        my $status = mirror($url . $_, $_);
        print $status == RC_OK           ? 'downloaded'
            : $status == RC_NOT_MODIFIED ? 'already downloaded'
            :                              "error: $status";
        print "\n";
    }
    print "Download complete.\n";
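    mirror() stores the fetched document in the named file and sends an If-Modified-Since header based on that file's modification time, so packages that are already up to date come back as RC_NOT_MODIFIED instead of being downloaded again.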


    See perltoc if you don't know which perldoc to read!

Re: Downloading a lot of files
by davidrw (Prior) on Jun 24, 2005 at 01:40 UTC
    (since TMTOWTDI can include shell utils ;) ) Here's the bash syntax (though with less error handling, which could be partially fixed by testing for the downloaded filename and checking $?; you could also tweak wget's command-line options, use curl instead of wget, etc.):
    url='.../Fedora/RPMS/'    # leading portion of the URL was stripped in the original
    for f in `cat upgrade.log`; do wget $url$f; done
    But obviously LWP::Simple is cooler. :)

      That way you lose wget's persistent connections. I use wget -i upgrade.log if I have a file, or open WGET, "| wget -i -" if I don't know the list of URLs beforehand.
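
      Since upgrade.log in this thread holds bare package filenames rather than full URLs, a minimal sketch of that piped approach might look like the following (the base URL is assumed to be the same one as in the snippet above):

      # feed full URLs to a single wget process so it can reuse connections
      my $url = '...';    # same base URL as in the snippet above
      open my $wget, '|-', 'wget -i -'
          or die "Cannot start wget: $!\n";
      open my $list, '<', './upgrade.log'
          or die "Cannot read upgrade.log: $!\n";
      while (my $pkg = <$list>) {
          chomp $pkg;
          print {$wget} $url . $pkg, "\n";    # wget -i - reads one URL per line from stdin
      }
      close $list;
      close $wget or warn "wget exited with status $?\n";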

Re: Downloading a lot of files
by fmerges (Chaplain) on Jun 24, 2005 at 12:37 UTC

    I know this is about Perl... but why don't you use tools like rsync or apt-get?

    Apt-get is a wonderful tool, and there are other similar tools as well.

    I use rsync for so many tasks: file syncs, mirroring, etc.

    Another really interesting tool is Unison, for synchronising hosts in both directions.

    Hope it can be useful. Regards

