
Script being caught up, somewhere. Causing infinite loop.

by KyleYankan (Acolyte)
on Feb 02, 2003 at 19:21 UTC ( #232026=perlquestion: print w/replies, xml ) Need Help??
KyleYankan has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm in the process of revamping my site, and am having a wee bit of trouble. I use a template on my site, and simply grab the data to display out of a .inc file, using PHP. I want to transfer this all over to a Perl site, kinda like PerlMonks. I really like the idea, plus I'm much, MUCH, better at Perl than PHP.

My folders are laid out like so: ./cgi-bin/ ./public_html/ ./content/ ./other_stuff/ The script is in the cgi-bin, and all the *.inc's are in the content folder.

My problem is that my script goes into an infinite loop, or just hangs. This messes me up, because it hangs with the MySQL connection open, and I can't connect more than X times to my server. This means my website goes down when it runs. Here's the code to the transfer script. Sorry for obvious mistakes, I'm tired and grumpy. Wrote it late last night, and haven't slept since, thanks to a fire call (I'm a firefighter).
#!/usr/bin/perl
#INCLUDE LIBRARIES AND MODULES AS NEEDED
#use strict;
use CGI::Carp qw(fatalsToBrowser);
use CGI;
use DBI;

#assign values
$DBuser = "not";
$DBpass = "telling";
$DBtable= "the";
$DBhost = "world";

#connect to Database
$dbh = DBI->connect("DBI:mysql:$DBtable:$DBhost", "$DBuser", "$DBpass",
    { PrintError => 0, RaiseError => 1 } )
    || die "Error connecting: $DBI::errstr!";

print "Content-type: text/html\n\n";

#grab files into @files
opendir(DIR,"/home/kyleyankan/public_html/content/") || die "Can't open dir: $!";
@files = readdir(DIR);
close(DIR);
#@files = glob("../public_html/content/*.inc");

#run loop, to transfer each file
while (@files) {
    #print filename
    print $_ . "\n";

    #grab file contents
    open(FILE,"/home/kyleyankan/public_html/content/$_");
    @content = <FILE>;
    close(FILE);
    while (@content) {
        $stuff .= $_;
    }

    #remove .inc from file
    $_ =~ s/.inc//;

    #dump into DB
    $sth = $dbh->prepare(qq~
        INSERT INTO quadrants (title, author, content)
        VALUES ("$_","KyleYankan","$stuff")~)
        || die "Can't insert values: $!";
    $sth->execute || die "Can't insert values: $!";
}

#leave that mysql alone
$dbh->disconnect();

Replies are listed 'Best First'.
Re: Script being caught up, somewhere. Causing infinite loop.
by Enlil (Parson) on Feb 02, 2003 at 19:35 UTC
    I think that all you really need to do to remove your infinite loops is change the while to foreach.

    (This is just after a quick scan; there might be other problems.)
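    To illustrate the point, here's a minimal, self-contained sketch (the filenames are made up) of why `while (@files)` never terminates but `foreach` does:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    my @files = ('index.inc', 'about.inc');

    # while (@files) { ... } tests whether the array is non-empty.
    # Nothing in the loop body removes elements, so the condition
    # stays true forever (and $_ is never set to the current file).

    # foreach visits each element exactly once and then stops:
    my $count = 0;
    foreach my $file (@files) {
        print "$file\n";
        $count++;
    }
    print "visited $count files\n";
    ```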


Re: Script being caught up, somewhere. Causing infinite loop.
by mattriff (Chaplain) on Feb 02, 2003 at 19:35 UTC
    You are using a while loop, but you need a foreach loop. As it stands, @files is always true, so your while loop never exits.

    On another note, I think the way you are using (open|read|close)dir is also adding "." and ".." to @files, which probably isn't what you want. You might want to use something like:

    @files = grep /\.inc$/, readdir(DIR);
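    A self-contained sketch of that filtering (using a scratch directory so it's runnable anywhere; the filenames are invented):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Temp qw(tempdir);

    # Create a scratch directory with one .inc file and one other file
    my $dir = tempdir( CLEANUP => 1 );
    for my $name ('page.inc', 'notes.txt') {
        open my $fh, '>', "$dir/$name" or die "Can't create $name: $!";
        close $fh;
    }

    opendir my $dh, $dir or die "Can't open dir: $!";
    # readdir returns '.' and '..' too; the grep keeps only *.inc entries
    my @files = grep { /\.inc$/ } readdir $dh;
    closedir $dh;

    print "@files\n";   # only page.inc survives the filter
    ```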

    UPDATE: enlil is just slightly quicker on the draw today, I guess. ;)

    UPDATE AGAIN: Re: the response below. Careful now -- you very likely can open . and .., and you'll get very weird results.

    - Matt Riffle

      Ah, see? Told ya I'd make a stupid mistake. Thank you very much. I can also do a
      open(FILE,"/home/kyleyankan/public_html/content/$file") || next;
Re: Script being caught up, somewhere. Causing infinite loop.
by Anonymous Monk on Feb 02, 2003 at 19:52 UTC

    Besides using for or foreach for the outer loop, you can replace

    open(FILE,"/home/kyleyankan/public_html/content/$_");
    @content = <FILE>;
    close(FILE);
    while (@content) {
        $stuff .= $_;
    }

    with

    open(FILE,"/home/kyleyankan/public_html/content/$_");
    my $stuff = join '', <FILE>;

    Or, if you're comfortable with the concept:

    my $stuff = do {
        local (*ARGV, $/);
        @ARGV = "/home/kyleyankan/public_html/content/$_";
        <>;
    };

    Also, you should really be checking the return code from open. eg

    open FILE, "< path/$_" or warn "Error on $_: $!";
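    Putting those pieces together, here's a runnable sketch of a checked open plus a one-shot slurp (it writes its own fixture file via File::Temp so the path isn't assumed):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    # Write a small fixture file so the example is self-contained
    my ($tmp_fh, $path) = tempfile();
    print {$tmp_fh} "line one\nline two\n";
    close $tmp_fh;

    # Always check the return value of open
    open my $fh, '<', $path or die "Error on $path: $!";
    my $stuff = do { local $/; <$fh> };   # undef $/ slurps the whole file
    close $fh;

    print length($stuff), " bytes read\n";
    ```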
Re: Script being caught up, somewhere. Causing infinite loop.
by diotalevi (Canon) on Feb 02, 2003 at 23:44 UTC

    Here's a reworking. Just read it, look for the differences, and consider why I made those changes.

    # Use strict and warnings. This is not optional, especially when asking
    # for help.
    use strict;
    use warnings;
    use CGI::Carp qw(fatalsToBrowser);

    # Import some defaults
    use CGI qw/:standard/;

    # Explicitly prevent any imports since we don't need them.
    use DBI ();

    # Slurp all files
    undef $/;

    # Put the configuration data right up at the top so you
    # aren't putting it into your script.
    use constant DB_HOSTNAME => 'not';
    use constant DB_USERNAME => 'telling';
    use constant DB_PASSWORD => 'the';
    use constant DB_TABLE    => 'word';
    use constant INC_DIR     => "/home/kyleyankan/public_html/content/";

    # To format is divine. Just do it.
    my $dbh = DBI->connect(
        "DBI:mysql:" . DB_TABLE . ':' . DB_HOSTNAME,
        DB_USERNAME,
        DB_PASSWORD,
        { PrintError => 0, RaiseError => 1 },
    ) or die "Error connecting: $DBI::errstr!";

    # Don't do HTML when you can have CGI create it for you
    # (and it comes out nice too)
    print header(), start_html();

    # Read all of the '*.inc' files from the specified
    # directory
    my @inc_files = glob INC_DIR . "*.inc";

    # This was your infinite loop. You used while() when you should
    # have used for().
    for my $filename ( @inc_files ) {
        print $filename, br;

        # You didn't check for failure here. A no-no. Also,
        # since I unset $/ the file will be slurped in one shot.
        # Also, your use of the handle is distressing. First you
        # read the entire thing into an array. Next you read each
        # element (after you switch while() for for()) and
        # concatenate to the $stuff variable (which you didn't
        # remember to use my() on, so it persists between files).
        # And during writing these comments I got distracted and
        # upgraded a server. Guess I'll just post these.
        open FILE, $filename          # glob() already returned the full path
            or die "Couldn't open $filename: $!";
        my $file_contents = <FILE>;
        close FILE or die "Couldn't close $filename: $!";

        # remove .inc from the filename. Originally you wrote
        # $_ =~ s/.../.../ and that's just redundant. Either you
        # specify the variable because it's not $_ or just let
        # the operator take the default.
        $filename =~ s/\.inc$//;

        # insert into the database.
        # 1) add formatting.
        # 2) remove the 'or die' lines because you already had RaiseError set
        #    for the connection
        # 3) use placeholders
        # 4) Don't abuse the use of "$variable", ...
        $dbh->do(
            "INSERT INTO quadrants ( title, author, content )
             VALUES ( ?, ?, ? )",
            undef,
            $filename, "KyleYankan", $file_contents,
        );
    }

    #leave that mysql alone
    $dbh->disconnect();
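    The placeholder point is the important one. A runnable sketch of it, using DBD::SQLite with an in-memory database instead of the thread's MySQL setup (an assumption made purely so the example is self-contained; the `quadrants` schema is likewise invented):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Assumes DBD::SQLite is installed; :memory: avoids touching disk
    my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
        { PrintError => 0, RaiseError => 1 });

    $dbh->do('CREATE TABLE quadrants (title TEXT, author TEXT, content TEXT)');

    # With placeholders the driver handles quoting, so embedded quotes
    # in the content can't break (or inject into) the SQL statement.
    my ($filename, $file_contents) = ('about', q{He said "hi" to O'Brien});
    $dbh->do(
        'INSERT INTO quadrants (title, author, content) VALUES (?, ?, ?)',
        undef, $filename, 'KyleYankan', $file_contents,
    );

    my ($stored) = $dbh->selectrow_array(
        'SELECT content FROM quadrants WHERE title = ?', undef, $filename);
    print "$stored\n";
    ```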

    Seeking Green geeks in Minnesota

Re: Script being caught up, somewhere. Causing infinite loop.
by KyleYankan (Acolyte) on Feb 02, 2003 at 19:26 UTC
    Sorry, one mess-up there. Forgot to put the DIRs in < code > tags. It looks like this:
    ./cgi-bin/ ./public_html/ ./content/ ./images/ ./other_stuff/
Re: Script being caught up, somewhere. Causing infinite loop.
by castaway (Parson) on Feb 02, 2003 at 19:38 UTC
    Did you try looking at the contents of $dbh->errstr() after your $dbh->prepare(.. ) ?
    And I'm not sure if you can change the value of $_ (using s///); maybe you should copy the contents to another variable first.
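    For the record, you can change $_ inside a foreach loop, but it's an alias for the current array element, so s/// rewrites the array itself. A small sketch (filenames invented) of the copy-first approach suggested above:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    my @files = ('index.inc', 'about.inc');
    my @titles;

    # The loop variable aliases the array element; modifying it in
    # place would alter @files. Copying first leaves @files intact.
    foreach my $file (@files) {
        my $title = $file;     # work on a copy
        $title =~ s/\.inc$//;
        push @titles, $title;
    }

    print "@titles\n";   # index about
    print "@files\n";    # index.inc about.inc (unchanged)
    ```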


Re: Script being caught up, somewhere. Causing infinite loop.
by KyleYankan (Acolyte) on Feb 02, 2003 at 19:46 UTC
    OK, testing it. I hope it doesn't go into another wild crazy loop. How would I shut it down if it did? The file is called

Node Type: perlquestion [id://232026]
Approved by TStanley