
Blank data file

by Anonymous Monk
on Jul 09, 2002 at 21:01 UTC ( [id://180607] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi all, The following code is running on a high traffic site:
open(file, "$mailclicks");
flock(file, 2);
@recs = <file>;
flock(file, 8);
close(file);
$i = 0;
foreach $rec (@recs) {
    $rec =~ s/\n//g;
    ($id,$description,$need,$received) = split(/::/,$rec);
    if ($id eq $formdata{'id'}) {
        $received++;
        $recs[$i] = "$id\::$description\::$need\::$received";
    }
    $i++;
}
open(file, ">$mailclicks");
flock(file, 2);
foreach $r (@recs) {
    print file "$r\n" unless ($r eq "" || $r eq "\n");
}
flock(file, 8);
close(file);
This increments a tracking counter for use by another script. The problem is that the data file goes blank randomly. Other scripts access the file but I have isolated it to this snippet. I haven't been able to find a pattern or reproduce the problem myself. It generally takes 100 or so users before it happens. No corruption of the file along the way. I'm hoping this is a common problem in perl? Thanks much, Ian

Replies are listed 'Best First'.
Re: Blank data file
by dws (Chancellor) on Jul 09, 2002 at 21:12 UTC
    This increments a tracking counter for use by another script. The problem is that the data file goes blank randomly.

    You have a timing window in the code. Consider what happens if you execute open(file, ">$mailclicks"); immediately before another process obtains a lock. That process now reads (and will later write back) a blank file.

    You might want to expand the lifetime of the lock to encompass the entire read-modify-write cycle.
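    One way to do that (a minimal sketch; the filename is made up and the modify step is left as a comment) is to open the file read-write without truncating, take the exclusive lock first, and only then rewrite it in place:

```perl
use strict;
use warnings;
use Fcntl qw(:DEFAULT :flock :seek);

my $mailclicks = 'mailclicks.dat';   # hypothetical filename

# Open read-write WITHOUT truncating, and take the lock before touching data.
sysopen my $fh, $mailclicks, O_RDWR | O_CREAT
    or die "cannot open $mailclicks: $!";
flock $fh, LOCK_EX
    or die "cannot lock $mailclicks: $!";

my @recs = <$fh>;             # read under the lock
# ... modify @recs here ...
seek $fh, 0, SEEK_SET;        # rewind and rewrite in place
print {$fh} @recs;
truncate $fh, tell $fh;       # trim leftovers if the file shrank
close $fh;                    # close releases the lock
```

    Because a single lock covers the read, the modification, and the write, no other process can sneak in between them.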

Re: Blank data file
by kvale (Monsignor) on Jul 09, 2002 at 21:15 UTC
    I have seen this problem before, due to multiple processes trying to read and write files simultaneously. Here is a more cautious style of locking using flock that hasn't let me down yet (probably adapted from the Cookbook, I can't remember):
    use Fcntl qw(:flock);   # for the LOCK_* constants

    # reading
    open NEWSRC, "< $newsgroup_file"
        or die "Could not open newsgroup file $newsgroup_file\n";
    unless (flock NEWSRC, LOCK_SH | LOCK_NB) {
        print "Contention: cannot get a shared lock on the file, blocking ($!)...\n";
        unless (flock NEWSRC, LOCK_SH) {
            die "ERROR: cannot get a shared file lock on $newsgroup_file: $!\n";
        }
    }
    print "Initial newsrc read lock granted\n";
    my @news_lines = <NEWSRC>;
    close NEWSRC;

    # writing
    open NEWSRC, "+< $newsgroup_file"
        or die "Could not open newsgroup file $newsgroup_file\n";
    unless (flock NEWSRC, LOCK_EX | LOCK_NB) {
        print "$group ($$): Contention: cannot get an exclusive lock on the file,\n blocking ($!) ...\n";
        unless (flock NEWSRC, LOCK_EX) {
            die "$group ($$): ERROR: cannot get an exclusive file lock on $newsgroup_file: ($!)\n";
        }
    }
    print "$group ($$): Write lock granted\n";
    print NEWSRC @news_lines or die "Disk full\n";
    close NEWSRC;
Re: Blank data file
by Aristotle (Chancellor) on Jul 09, 2002 at 22:15 UTC

    Your problem is open(file, ">$mailclicks"); It will clobber the file before it ever gets the lock.

    Also, you are doing a lot of opens and flocks without checking their results, which means you'll keep running with faulty data that will subsequently be written back to files it should never reach.

    Not to mention you have a race condition: you slurp the file, then unlock and close it, then reopen it and write back your results. In the meantime, another process may have updated the data, and your write will clobber that update with stale records.

    Also, releasing the lock before closing is a bad idea (I can't quite remember where it is, but merlyn had an excellent writeup on it): closing the file releases the lock anyway, so just leave well enough alone.

    use Fcntl qw(:DEFAULT :flock :seek);
    sysopen FILE, $mailclicks, O_RDWR | O_CREAT
        or die "couldn't open $mailclicks: $!";
    flock FILE, LOCK_EX
        or die "failed acquiring lock on $mailclicks: $!";
    my @slurp = <FILE>;
    # do something with @slurp
    seek FILE, 0, SEEK_SET;    # go back to top of file
    print FILE @slurp;
    truncate FILE, tell FILE;  # because the file may be smaller than it was before
    close FILE;

    Note you should use chomp rather than s/\n//.
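    A trivial illustration of the difference (a sketch; any one-line record will do):

```perl
use strict;
use warnings;

my $line = "1::widgets::5::3\n";

(my $regex_way = $line) =~ s/\n//g;   # original approach: strips every newline
my $chomp_way = $line;
chomp $chomp_way;                     # strips only the trailing record separator

# For a single line the results are identical, but chomp states the intent.
```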

    But what is your loop doing there? I see a lot of very roundabout ways of doing things. First of all, you iterate over @recs, but because you want to modify the current element in the if block, you keep track of its index in $i. That is entirely unnecessary: the loop variable, $rec in your case, is an alias to the current element. If you modify it, the actual array element is modified. So you can skip the $i business and simply write $rec = rather than $recs[$i] =.
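    A minimal demonstration of that aliasing:

```perl
use strict;
use warnings;

my @recs = ('a', 'b', 'c');
foreach my $rec (@recs) {
    $rec = uc $rec;   # the loop variable aliases the element, so this sticks
}
# @recs is now ('A', 'B', 'C') with no index bookkeeping at all
```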

    Another thing I don't understand is why you glue the split values back together. The string you assign to $recs[$i] is exactly what you had at the beginning of the loop iteration, so you could just write $rec and get the same result.

    Now the funny part: we have simplified that assignment to $rec = $rec.

    All that happens within the if is that $received gets updated. And since you did not modify the initial data, all that happens when you write the file back is that empty lines are removed. For that result, you're doing far too much work.

    Since all you use is the $id part, you can also leave out the chomp entirely and limit your split to return only the part up to the first ::.
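    That is, with a split limit of 2, only the first field is extracted and the trailing newline stays in the discarded remainder (sketch; the record is made up):

```perl
use strict;
use warnings;

my $line = "42::free mug::10::7\n";
my ($id) = split /::/, $line, 2;   # limit 2: stop after the first field

# $id is "42"; the newline stays in the (discarded) remainder,
# so no chomp is needed before comparing ids.
```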

    Your entire code could be simplified to the following:
    use Fcntl qw(:DEFAULT :flock :seek);
    sysopen FILE, $mailclicks, O_RDWR | O_CREAT
        or die "couldn't open $mailclicks: $!";
    flock FILE, LOCK_EX
        or die "failed acquiring lock on $mailclicks: $!";
    my @records = <FILE>;
    seek FILE, 0, SEEK_SET;
    for (@records) {            # now the line will be in $_ rather than $rec
        next if $_ eq "\n";     # skip empty lines completely
        my ($id) = split /::/, $_, 2;
        ++$received if $id eq $formdata{id};
        print FILE $_;
    }
    truncate FILE, tell FILE;
    close FILE;

    Much clearer, and no race conditions.

    From the example you post it also appears that you're using neither strict nor warnings. That's a bad idea; you're inviting subtle and hard-to-find bugs caused by typos that Perl could easily spot for you.

    Makeshifts last the longest.

      First off, thanks all.

      Aristotle... I appreciate your comments but I'm not really a perl programmer and I don't particularly want to be. ;-) When I write server side scripts from scratch I use PHP and I tend to enjoy it. :-)

      The code I gave is modified from the file access routines that an existing set of scripts was already using. I'm aware it's redundant, server intensive, and generally low end. What you saw was only the beginning. The entire member database is flat file.

      The client, however, didn't ask for a rewrite (just a new script to do X, X and X) and they definitely didn't pay for one.

        <rant mode>
        While it's a shame that you don't want to learn Perl, that's no excuse to write crappy code.
        </rant mode>

        On the other hand, it would be best (in future projects) to:
        use strict;
        use warnings;

        <rant mode again>
        Bad programming practices are bad programming practices no matter what language you're coding in. To quote one of the other monks: "Only Bad Coders Code Bad Perl." There is a nice tutorial on file locking here that would be of some great use to you in future projects. Please! Please! Please! Read up on files and file locking before writing code for "clients." I've already been bitten by not handling file locking properly, so I suggest doing some more in-depth reading, even if you're not going to become an uber Perl coder.
        </rant mode>

        Theodore Charles III
        Network Administrator
        Los Angeles Senior High
        4650 W. Olympic Blvd.
        Los Angeles, CA 90019
        323-937-3210 ext. 224
        perl -e "map{print++$_}split//,Mdbnr;"
Re: Blank data file
by TexasTess (Beadle) on Jul 09, 2002 at 22:04 UTC
    If it's used on a busy server, why don't you add a subroutine that waits if the file is busy, using the sleep command? Also, is there a reason why you don't use the "or die" option for handling an error?
    open (DISFILE, "<" . $datfile) or die "Can't do it";
    &lock_files(*DISFILE);
    # --- perform calculations or whatever ---
    &Unlock(*DISFILE);
    close DISFILE;

    sub lock_files {
        my $SomeFile = shift;
        my $timeout  = 0;
        until (flock($SomeFile, 2)) {
            sleep 1;    # core sleep only does whole seconds; "sleep .20" would not wait at all
            if (++$timeout > 50) {
                print "Server Time out Message";
                last;   # give up instead of looping forever
            } # end if
        } # end until
    } # end sub lock_files

    sub Unlock {
        my $SomeFile = shift;
        flock($SomeFile, 8);
    } # end sub Unlock
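    If a sub-second retry interval is wanted, here is a sketch using a non-blocking flock plus Time::HiRes, whose sleep accepts fractional seconds (the sub name, retry count, and temp filename are made up):

```perl
use strict;
use warnings;
use Fcntl qw(:flock);
use Time::HiRes qw(sleep);   # lets sleep take fractional seconds

# Try for the lock without blocking; retry every 0.2s, up to $tries times.
sub lock_with_timeout {
    my ($fh, $tries) = @_;
    for (1 .. $tries) {
        return 1 if flock $fh, LOCK_EX | LOCK_NB;
        sleep 0.2;
    }
    return 0;   # caller prints the timeout message and bails out
}

open my $fh, '>>', 'locktest.tmp' or die "cannot open: $!";
my $got_lock = lock_with_timeout($fh, 5);
close $fh;
unlink 'locktest.tmp';
```

    Returning a status instead of printing inside the sub lets the caller decide how to report the timeout.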

    "Great Spirits Often Encounter Violent Opposition From Mediocre Minds" --Albert Einstein
Re: Blank data file
by rah (Monk) on Jul 10, 2002 at 01:03 UTC
    All of these file locking schemes leave open the possibility that another script (or another instance of this script) could open the file before the lock is obtained. If that happens, someone is going to get a blank file. You might consider using a semaphore file for your lock mechanism. Take a look at the excellent article by Sean Burke in TPJ (Sysadmin Mag) from Feb 2002. He describes a very similar scenario and how he solved it with a semaphore.
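    The idea, as a rough sketch (filenames made up): take the lock on a separate file that is never truncated, so there is no window in which the data file can be clobbered between its open and its flock:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $datafile = 'mailclicks.dat';   # hypothetical data file
my $semfile  = 'mailclicks.lck';   # semaphore file: its contents never matter

open my $sem, '>', $semfile or die "cannot open $semfile: $!";
flock $sem, LOCK_EX         or die "cannot lock $semfile: $!";

# Now it is safe to clobber: every writer agrees to take $semfile first.
open my $out, '>', $datafile or die "cannot open $datafile: $!";
print {$out} "id::description::need::received\n";
close $out;

close $sem;   # releasing the semaphore lets the next writer in
```

    This only works if every script that touches the data file honors the same semaphore.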
Re: Blank data file
by Anonymous Monk on Jul 10, 2002 at 04:05 UTC

    Free advice: (Not worth the electrons it's displayed with.)

    If I understand you correctly, you're a PHP programmer who has been contracted to do some (PHP) work on a system. You have discovered that you have a dependency on some Perl written by someone else at some other time, and you need to modify that (apparently rather badly written) code. You are trying to get advice on the best way to achieve your ends without having to rewrite the whole system, because you wouldn't want to, and wouldn't get permission/paid to do it anyway.

    1) Use the code snippets in this thread, taking their best advice, to do the minimum work that you need to.

    2) Print off a copy of the rants and raves herein and keep them until your project is finished. THEN take a copy of the original script you modified and contrast it with your new version for the man who pays the bills. Point out that the previous (Perl) programmer did not do a good job, as testified to by the rants above, and, if you feel up to doing the job (in Perl if you can, PHP if you know you can do that but not the Perl), give the man a realistic quote for doing the work.

    You just might get some extra work at a time when many are struggling to find some.

    Finally, if your schedule, your knowledge or your desire prevent you from doing the work, you could suggest a finders fee for locating a (decent) Perl programmer to do the job and bring the specs along here. I'm sure that you could find a willing, competent man hereabouts that would be grateful for the hand-me-along.

    Good luck in your project.


      Yours was, possibly, the only intelligent reply after Aristotle's, thanks ;-)

      Ok, to clarify, this thread answered my question early on and everything is working spiffy now.

      It's amazing how diamond-sphinctered programmers can be. I could have sworn that someone just called me a bad programmer! All gleaned from this thread? Amazing intuition, or not enough time in the sun? ;-)

      About going back with improvement suggestions. Not a bad idea but not a possibility with this client. People that buy scripts like this don't have the budget to have them rewritten.

      Besides, it's a big ugly mess that wouldn't be any fun to clean up. :-)

      About struggling for work... Actually I have had a backlog of jobs for the last couple of months. I've stopped advertising. Not only that, programming and design somehow manages to get more fun, instead of less, over the years.

      I think perhaps I will strive to become an even worse (by the diamond sphincter definition, of course) programmer in the future ;-)

      Thanks again for answers all.
