http://www.perlmonks.org?node_id=447290

spyders has asked for the wisdom of the Perl Monks concerning the following question:

Sort of a repost, and something that was discussed earlier in the CB..

I have a small WWW::Mechanize script that I'm trying to get to work on my web site as a CGI. It works just fine from the command line, but as CGI on my web host it continually reloads infinitely, and oddly enough it appends my URL onto the current URL (who knows why).

It turns http://spydersubmission.com/cgi-bin/lyrics.pl (this is the real link)
Into http://spydersubmission.com/?http://spydersubmission.com/cgi-bin/lyrics.pl

Someone in the CB said it was working fine as CGI for them. So what could the problem be?

The module wasn't there this morning, so I had to download it, make a lib directory, and upload it myself. I thought that was the problem, but the web host has since downloaded and installed it themselves and it's still doing the same monkey business.

Can anyone explain this mystery? Does it work on your web server?

#!/usr/bin/perl
use warnings;
use strict;
use CGI qw/:standard/;
use CGI::Carp qw(fatalsToBrowser);
use WWW::Mechanize;

my $url   = "http://www.letssingit.com/frame_menu.html";
my $query = "avril";

my $mech = WWW::Mechanize->new( agent => "wonderbot" );
$mech->get($url);

print header, start_html("lyrics grabber");

$mech->submit_form(
    form_name => 'search',
    fields    => {
        s => $query,
        l => 'song',    # or 'artist' or 'album' or 'lyrics'
    },
);

my $content = $mech->content;
print $content;
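One likely culprit (a guess, not confirmed for this site): the fetched page is a frameset whose relative links and frame sources now resolve against *your* server instead of letssingit.com, which would explain both the appended URL and the reload loop. WWW::Mechanize's content() method can inject a &lt;base href&gt; tag so the browser resolves those relative URLs against the original site. A minimal sketch of that change, assuming the rest of the script above:

```perl
#!/usr/bin/perl
use warnings;
use strict;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( agent => "wonderbot" );
$mech->get("http://www.letssingit.com/frame_menu.html");

# Ask Mechanize to add a <base href="..."> pointing back at the
# original site, so relative links in the printed page resolve
# against letssingit.com rather than your CGI host.
print $mech->content( base_href => $mech->base );
```

If the page also carries a meta-refresh or frame that points back at itself, rewriting or stripping those tags before printing would be the next thing to try.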