conditional testing for error 500 webpages before following them?
by fraizerangus (Sexton)
on Oct 15, 2011 at 13:09 UTC
fraizerangus has asked for the
wisdom of the Perl Monks concerning the following question:
The website I am trying to scrape has some links that can't be followed because of a server issue. When I iterate through the links on the page, the program crashes on these 'down links'. What is the best way to test these links before I follow them and extract the URL?
An example of the down link is as follows: http://www.molmovdb.org/cgi-bin/motion.cgi?ID=ppar
Would I need Test::WWW::Mechanize to test before following? Also, is it possible to iterate get($url) in a loop over the links? Everything I've tried so far doesn't allow me to do so; it wants an absolute URL.
Using the following code:
many thanks and best wishes
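Not a definitive fix, but a minimal sketch of one way to do the pre-check described above, assuming WWW::Mechanize (the start URL is a placeholder): constructing the object with autocheck => 0 stops Mechanize from dying on HTTP errors such as 500, so each response can be tested instead, and url_abs() on each link supplies the absolute URL that get() expects.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

# Placeholder start page; substitute the real listing page.
my $start = 'http://www.molmovdb.org/';

# autocheck => 0 stops Mechanize from die()-ing on HTTP errors
# (the default autocheck => 1 is what makes the program crash on a 500).
my $mech = WWW::Mechanize->new( autocheck => 0 );
$mech->get($start);

# Grab the link list up front, since each get() replaces the current page.
my @links = $mech->links;

for my $link (@links) {
    # url_abs() resolves a relative href against the page's base,
    # yielding the absolute URL that get() requires.
    my $url = $link->url_abs;

    my $res = $mech->get($url);
    if ( $res->is_success ) {
        print "OK:   $url\n";
        # ... extract data from $mech->content here ...
    }
    else {
        warn "SKIP: $url (", $res->status_line, ")\n";
    }
}
```

The same status test works with a bare HEAD request via LWP::UserAgent if fetching the full page first is too slow.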