PerlMonks  

www::mechanize reloading page

by spyders (Initiate)
on Apr 13, 2005 at 07:42 UTC ( #447290=perlquestion )
spyders has asked for the wisdom of the Perl Monks concerning the following question:

Sort of a repost of something that was discussed earlier in the CB..

I have a small WWW::Mechanize script that I'm trying to get to work on my web site (as a CGI). It works just fine from the command line, but as a CGI on my web host it continually reloads infinitely, and oddly enough it appends my URL onto the current URL (who knows why).

It turns http://spydersubmission.com/cgi-bin/lyrics.pl (this is the real link)
Into http://spydersubmission.com/?http://spydersubmission.com/cgi-bin/lyrics.pl

Someone in the CB said it was working fine as CGI for them. So what could the problem be?

The module wasn't there this morning, so I had to download it, make a lib directory, and upload it. I thought that was the problem, but the web host has since downloaded and installed it themselves and it's still doing the same monkey business.

Can anyone explain this mystery? Does it work on your web server?

#!/usr/bin/perl
use warnings;
use strict;
use CGI qw/:standard/;
use CGI::Carp qw(fatalsToBrowser);
use WWW::Mechanize;

my $url   = "http://www.letssingit.com/frame_menu.html";
my $query = "avril";

my $mech = WWW::Mechanize->new( agent => "wonderbot" );
$mech->get($url);

print header, start_html("lyrics grabber");

$mech->submit_form(
    form_name => 'search',
    fields    => {
        s => $query,
        l => 'song',    # or 'artist' or 'album' or 'lyrics'
    },
);

my $content = $mech->content;
print $content;

Re: www::mechanize reloading page
by jbrugger (Parson) on Apr 13, 2005 at 09:55 UTC
    If you'd done a
    use Data::Dumper;
    die Dumper($content);
    You'd see this in the fetched page:
    <SCRIPT> if(self==top)top.location.replace("/?"+document.location) </SCRIPT>
    That frame-buster runs in your visitor's browser: because your CGI output is the top-level page, self==top is true and the script redirects to "/?" plus the current URL, which is exactly the doubled URL and the endless reloading you're seeing. Got it? :-)

    So this would work:
    #!/usr/bin/perl
    use warnings;
    use strict;
    use CGI qw/:standard/;
    use CGI::Carp qw(fatalsToBrowser);
    use WWW::Mechanize;

    my $url   = "http://www.letssingit.com/frame_menu.html";
    my $query = "avril";

    my $mech = WWW::Mechanize->new( agent => "wonderbot" );
    $mech->get($url);

    print header, start_html("lyrics grabber");

    $mech->submit_form(
        form_name => 'search',
        fields    => {
            s => $query,
            l => 'song',    # or 'artist' or 'album' or 'lyrics'
        },
    );

    my $content = $mech->content;
    $content =~ s/self==top//;
    print $content;
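    A slightly more thorough variant, shown only as a sketch, is to strip the entire frame-busting <SCRIPT> block instead of defusing its condition, so no leftover JavaScript reaches the visitor at all. The $content string below is a made-up stand-in for the fetched page, not the real letssingit.com markup:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sample of the fetched HTML; the real page will differ.
my $content = '<SCRIPT> if(self==top)top.location.replace("/?"+document.location) </SCRIPT><p>lyrics here</p>';

# Remove the whole frame-busting script element, case-insensitively,
# rather than just breaking its self==top test.
$content =~ s{<script[^>]*>\s*if\(self==top\).*?</script>}{}gis;

print $content, "\n";    # only the remaining markup survives
```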


    "We all agree on the necessity of compromise. We just can't agree on when it's necessary to compromise." - Larry Wall.
      THANK YOU THANK YOU THANK YOU THANK YOU THANK YOU!!!

      Made that small s/// and the thing works like a charm! You're so smart! lol

      Thanks!

Node Type: perlquestion [id://447290]
Approved by Corion