http://www.perlmonks.org?node_id=318476

pnguine has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm trying to make myself a robot to look for things on the web in a more specific manner than Google, etc. I'm using borrowed code to start, as I don't know much about this and there doesn't seem to be much on the net about it. This is the code so far:
#!perl
#ORIG# SMiTZ's Big Bad Support Crawler
#ORIG# crawls support.intel.com, reading title and meta description tags,
#ORIG# storing these in a lovely database.
#
# v0.1
use strict;
use warnings;
use WWW::Robot;
#ORIG# use DBI;          #probably use this later
#ORIG# use Data::Dumper; #think it goes with the above

#ORIG# my $docRoot = 'http://support.intel.com'; #see next line
my $docRoot       = 'http://www.linuxjournal.com/';
my $databaseName  = 'spider1';  #leave here, it shouldn't be a problem
my $databaseTable = 'index';    #see above

# Connect to DB
# my $dbh = DBI->connect("DBI:ODBC:$databaseName", {RaiseError=>1})
#     or die ("Couldn't connect to database: $DBI::errstr; stopped");

my $robot = new WWW::Robot(
    'NAME'        => 'penguine Bot',
    'VERSION'     => 0.1,
    'EMAIL'       => 'pnguine@hotmail.com',
    'VERBOSE'     => 0,  #NAUGHTY BOY!!! Change ASAP
    'DELAY'       => 0,
    'IGNORE_TEXT' => 0);
#ORIG# $robot->proxy('http', 'http://***.***.intel.com:911'); # damn firewall probably not needed

$robot->addHook('follow-url-test',    \&follow_url_test);
$robot->addHook('invoke-on-contents', \&invoke_on_contents);
$robot->run($docRoot);
#ORIG# $dbh->disconnect();

#---------------------------------------------------------------------
# Hook subs

sub follow_url_test {
    my ($robot, $hook, $url) = @_;
    return 0 unless $url->scheme eq 'http';
    return 0 if $url =~ /\.(gif|jpg|png|xbm|au|wav|mpg|doc|xml|ppt)$/;
    return $url =~ /^$docRoot/;
}

sub invoke_on_contents {
    my ($robot, $hook, $url, $response, $structure) = @_;
    return unless $response->content_type eq 'text/html';
    my $desc = $response->header("X-meta-keywords");
    $desc = 'none, none, none, none, none' unless ($desc);
    my $title = $response->header("title");
    my @desc  = split(/,/, $desc);
    # the following block is an attempt at pod syntax
    # =head1 COMMENTS
    # $dbh->do(q{
    #     INSERT INTO index (url, title, description1, description2, description3, description4, description5)
    #     VALUES (?, ?, ?, ?, ?, ?, ?)
    # }, undef, $url, $title, $desc[0], $desc[1], $desc[2], $desc[3], $desc[4])
    #     or die $dbh->errstr;
    # =cut
    print "*";
}

# I'll try my first run like this and see what happens - how do I make this executable in windoze?
I get this error when I try to run it:

WWW::Robot: failed to create User Agent object: LWP::RobotUA from address required at E:/Perl/site/lib/WWW/Robot.pm line 1160

Can't call method "addHook" on an undefined value at myrobot.pl line 32.

Thanks for any help - I'm lost at this point. BTW - Win98/ActivePerl

Replies are listed 'Best First'.
Re: a WWW::Robot robot ?
by castaway (Parson) on Jan 03, 2004 at 10:41 UTC
    A first glance at WWW::Robot shows that it hasn't been updated since 1997; LWP, however, has. I downloaded and tested it (there are no 'make test' tests, silly module); there is, however, an example program, which also does not run. I ran it through the Perl debugger, and it seems that WWW::Robot attempts to create a new LWP::RobotUA object, passing it 'NAME' and 'FROM' as parameters. LWP::RobotUA requires that the second parameter, which is supposed to be an email address, contain an @ sign, so it dies.
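    The failing constraint is easy to reproduce on its own. A minimal sketch (using LWP::RobotUA directly, not through WWW::Robot):

    ```perl
    use strict;
    use warnings;
    use LWP::RobotUA;

    # LWP::RobotUA->new($agent_name, $from) croaks "from address required"
    # unless $from looks like an email address (i.e. contains an '@').
    # This succeeds:
    my $ua = LWP::RobotUA->new('penguine Bot/0.1', 'pnguine@hotmail.com');
    $ua->delay(1);  # minutes to wait between requests to the same server

    # Whereas passing no (or a malformed) from-address, as broken versions
    # of WWW::Robot effectively do, dies with the error quoted above:
    # my $bad = LWP::RobotUA->new('penguine Bot/0.1');   # dies
    ```

    This matches the "from address required" message in the original error output.
    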

    In short, WWW::Robot is broken. This has already been reported on the CPAN bug tracker, but not yet fixed, see: here.

    (oops, my mistake, WWW::Robot has a newer version, last released in Nov 2002, but that is still broken.)

    C.

      Thanks very much - being among the uninitiated, I would have been tearing my hair out for weeks trying to get it to go.

      Any suggestions on another direction to take to do the same thing with perl?

      Phil N

        Hi. By accident I was looking at the code of someone who had used WWW::Robot to make it do something, and his code magically worked (http://www.stonehenge.com/merlyn/WebTechniques/col42.html).

        I realized that he was also passing "USERAGENT => LWP::UserAgent->new," when creating an instance of WWW::Robot. I'm not sure if this really works, because I haven't written any more code to test for sure (I'm off for sleep now anyway :)), but it has taken away the error message the user above was writing about, which I was getting too. Hopefully this is right.
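        For reference, the workaround described above would look something like this. This is a sketch only, untested beyond making the error go away (as the post says); the idea is that supplying a pre-built user agent via USERAGENT lets WWW::Robot skip its broken internal LWP::RobotUA construction:

        ```perl
        use strict;
        use warnings;
        use WWW::Robot;
        use LWP::UserAgent;

        my $robot = WWW::Robot->new(
            'NAME'      => 'penguine Bot',
            'VERSION'   => 0.1,
            'EMAIL'     => 'pnguine@hotmail.com',
            # Hand WWW::Robot a ready-made agent so it does not try (and
            # fail) to build an LWP::RobotUA itself:
            'USERAGENT' => LWP::UserAgent->new,
        );
        ```

        Note that a plain LWP::UserAgent does not honour robots.txt or enforce a per-host delay the way LWP::RobotUA does, so the DELAY setting becomes all the more important.
        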