PerlMonks |
Dealing with corrupt db_file files by gossamer (Sexton)
on Jan 16, 2013 at 03:03 UTC ( [id://1013484] )
gossamer has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm a novice Perl programmer and have written a set of functions that use DB_File to write to a hash keyed on a filename, storing some of its contents. The data is quarantine files from amavisd-new plus some information about each file, such as the subject, spam score, etc. I've been using these routines for quite some time, but every once in a while, searching through one of the db files causes my scripts to just hang. Is there a known problem with corruption, perhaps caused by locking, with DB_File? For example, when I use the typical routines to scan through the hash:
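A minimal sketch of the kind of scan being described, assuming a per-bucket db file in a hypothetical quarantine-index directory; the path and stored values are illustrative, not the poster's actual code:

use strict;
use warnings;
use DB_File;
use Fcntl;

# Assumed location of one bucket's db file; adjust to the real path.
my $db_path = '/var/amavis/quarantine-index/aa.db';

my %quarantine;
tie %quarantine, 'DB_File', $db_path, O_RDONLY, 0644, $DB_HASH
    or die "Cannot tie $db_path: $!";

# The reported hang happens while iterating the tied hash.
foreach my $file (keys %quarantine) {
    print "$file => $quarantine{$file}\n";
}

untie %quarantine;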
The 170 represents 'aa', so in this case I'm trying to read aa.db. I've hard-coded it here, where I normally have a loop that iterates through 256 db files; I've done it this way for brevity, and aa.db is the file with the problem. On occasion, the script will just hang after tying to the db, on the foreach line. I don't know how the file gets corrupted, but recreating it from the source of all the amavisd-new quarantine entries for that bucket fixes the problem. So, what would cause the script to hang when trying to process the foreach line? I've tried several other ways, including while loops, to process the hash, and they also lock up at that point. The script that creates the hash is much more involved, so I've not posted that here for now. Any ideas greatly appreciated. Thanks, Dave
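For reference, a hedged sketch of the outer loop being described: iterating over 256 per-bucket db files named 00.db through ff.db, where sprintf "%02x" turns the bucket number 170 into the name 'aa'. The directory is assumed, and the per-bucket scan is the same as in the sketch above:

use strict;
use warnings;

for my $bucket (0 .. 255) {
    my $name    = sprintf '%02x', $bucket;                    # 170 -> 'aa'
    my $db_path = "/var/amavis/quarantine-index/$name.db";    # assumed path
    # ... tie %quarantine, scan it, untie, as in the sketch above ...
}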