PerlMonks  

question about LWP and mysterious skipping over of pages

by dchandler (Sexton)
on Aug 24, 2004 at 20:26 UTC [id://385502]

dchandler has asked for the wisdom of the Perl Monks concerning the following question:

Hello. I'm writing a program that records the names of 25,000+ users who place pretend bets at a fantasy sports website. I have already written a program that successfully extracted all of their names, and I saved the list to a file.

Then, to access each player's record, my program begins by setting up an array with an entry for each of the players. By typing in www.blablabla.combla?(PLAYERSNAME) I can get more detailed information about that player. Depending on which sports the player has placed bets on (there are 11, including a record for ALL sports), there will be a link for each sport he has bet on. I first skim the general page to find out which sports he played and build a @sports array from that, then I fetch detailed records for each of those sports.

My program then loops through the @sports array containing the sports they've played. Unfortunately, sometimes my program simply does not extract values for a whole iteration of the loop. I think my regexps are okay, because I can ALWAYS rerun the program from the failed point and accurately capture values for that iteration. Below is the code (any other recommendations on better program design are welcome).

$browser = LWP::UserAgent->new();
$browser->agent("Mozilla/4.76 [en] (Win98; U)");
@sports = qw(ALL);

open PLAYERS, "players.txt" or die;

sub urldefine {
    $url = URI->new('http://www.wagerline.com/showprofile.asp?');
    $url->query_form( 'Member' => $_[0], 'Sport' => "$_[1]", );
}

while (<PLAYERS>) {
    chomp $_;
    push (@plyrs, $_);
}

open DUMP,   ">dump.txt"    or die;
open STATS2, ">>stats2.txt" or die;

for ($i = 17022; $i <= 19000; $i += 1) {
    @sports  = ();
    @urlfeed = ($plyrs[$i], "all");
    urldefine @urlfeed;
    $page = get("$url");

    for my $a (split "&Spo", $page) {
        if ($a =~ /rt=(\w*)"/g) {
            push (@sports, $1);
        }
    }

    for ($j = 0; $j <= $#sports; $j += 1) {
        print ".";
        @urlfeed = ($plyrs[$i], $sports[$j]);
        urldefine @urlfeed;
        $page = get("$url");

        print STATS2 "Player\t", "Sport\t", "Season\t", "BetType\t";
        print STATS2 "Record\t", "Wins\t", "Losses\t", "Ties\t", "Percent\t";
        print STATS2 "Units\t", "Rank\t", "TotalRank\n";

        for my $a (split "/td>", $page) {
            if ($a =~ /al">([AC]\w{2})/m) { $year = "$1" }
            if ($a =~ /al">(\d{4})/m)     { $year = "$1" }
            if ($a =~ /"2">([AO].*?)</m) {
                print STATS2 $plyrs[$i], "\t", $sports[$j], "\t",
                             $year . "\t", $1 . "\t";
            }
            if ($a =~ /18%"><SMALL>([^<].*?)</m) {
                print STATS2 $1, "\t";
                if ($1 =~ /(\d+)-(\d+)-(\d+)/m) { print STATS2 $1, "\t", $2, "\t", $3, "\t" }
                if ($1 eq "N/A")                { print STATS2 $1, "\n" }
            }
            elsif ($a =~ /18%"><SMALL>.*?>(\d*)\s\w{2}\s(\d*)/m) {
                print STATS2 $1, "\t", $2, "\n";
            }
        }
    }
    print "Finished Player #", $i + 1, ": ", $plyrs[$i], "\n";
}

20040824 Janitored by Corion: Added code tags

Replies are listed 'Best First'.
Re: question about LWP and mysterious skipping over of pages
by ikegami (Patriarch) on Aug 24, 2004 at 21:18 UTC

    Do you see a line "Player\tSport\tSeason\t..." in STATS2 for the skipped loop? If not, that means @sports doesn't get populated as you expect, either because the page you downloaded wasn't what you expected, or because it wasn't downloaded at all. Try checking whether $page is defined (and turn on warnings). If that's the problem, you could retry a download that failed.

      I was thinking something along those lines as well. If it were the case that the page wasn't coming in okay, would something like: unless ($page) { redo;} to just repeat the loop be good?
        yup, but you may want to limit the number of times you retry, and maybe pause a second before retrying.
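        The bounded-retry-with-pause idea could be sketched as a small wrapper. This is only an illustration, not code from the thread: the name get_with_retry, the default counts, and the code-ref parameter (used so the retry policy can be exercised without hitting the network) are all my own.

        use strict;
        use warnings;

        # Try a download up to $max_tries times, pausing between attempts.
        # $fetch is a code ref, e.g. sub { get("$url") }, that returns the
        # page on success or undef on failure.
        sub get_with_retry {
            my ($fetch, $max_tries, $delay) = @_;
            $max_tries ||= 3;
            $delay = 1 unless defined $delay;
            for my $try (1 .. $max_tries) {
                my $page = $fetch->();
                return $page if defined $page;
                if ($try < $max_tries) {
                    warn "attempt $try failed, retrying...\n";
                    sleep $delay;
                }
            }
            return undef;    # caller must still check for total failure
        }

        In the posted loop, this would replace the bare $page = get("$url") with something like $page = get_with_retry(sub { get("$url") }); next unless defined $page; so a player is skipped only after several failed attempts rather than on the first hiccup.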

Node Type: perlquestion [id://385502]
Approved by Corion