I'm just learning Perl. I'm trying to write a script that uses the LWP module to grab the HTML/text from several web pages. I can get the contents of a single web page when I type its address in directly, using this:
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

my $UserAgent = LWP::UserAgent->new;
my $Request   = HTTP::Request->new(GET => 'http://www.website.com');
my $Response  = $UserAgent->request($Request);

open my $fh, '>', '/strawberry/perl/file.txt'
    or die "Can't open file.txt for write: $!";
print $fh $Response->content;   # use the accessor, not the internal $Response->{_content}
close $fh;
I also have code that reads a file into an array, one line per element. Maybe my description is off, but this script works:
use strict;
use warnings;

my $file = "/strawberry/perl/website.txt";
open my $fh, '<', $file or die "Can't open $file for read: $!";
my @lines = <$fh>;
print @lines;
close $fh or die "Cannot close $file: $!";
Now I want to combine the two, so that the 'get' loops through every entry in the website.txt file from the second script and records the information. Something like this:
my $Request = new HTTP::Request ('get', '@lines') or
my $URL = get('@lines')
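Neither of those attempts will work as written (single quotes don't interpolate the array, and 'get' wants one URL at a time). Here is a sketch of one way the two scripts could be combined, assuming website.txt holds one full URL (including http://) per line; the numbered output filenames (page1.txt, page2.txt, ...) are my own invention, not anything from the original scripts:

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $file = '/strawberry/perl/website.txt';
open my $in, '<', $file or die "Can't open $file for read: $!";
chomp(my @lines = <$in>);   # strip trailing newlines so each entry is a clean URL
close $in;

my $ua    = LWP::UserAgent->new;
my $count = 0;

for my $url (@lines) {
    next unless $url =~ /\S/;        # skip blank lines in the file

    my $response = $ua->get($url);   # get() builds and sends the request for us
    unless ($response->is_success) {
        warn "Failed to fetch $url: ", $response->status_line, "\n";
        next;
    }

    ++$count;
    my $out = "/strawberry/perl/page$count.txt";   # hypothetical naming scheme
    open my $fh, '>', $out or die "Can't open $out for write: $!";
    print $fh $response->content;
    close $fh;
}
```

The loop fetches each URL in turn and writes each page to its own file, skipping (with a warning) any address that fails, so one bad entry doesn't kill the whole run.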
I have seen a lot of modules (and tutorials) that will let people get information from a web site, but I haven't seen anything on accessing multiple web sites. I don't really care which module I use; I just want to be able to work through a large list of web sites stored in a text file.
Any thoughts? Thanks in advance.