PerlMonks
Re: Ensuring only one copy of a perl script is running at a time
by benizi (Hermit) on Dec 20, 2006 at 00:21 UTC ( id://590826 )
Not sure if it's relevant to your situation, but this won't work over (some?) network filesystems. In my case, I have a home directory in AFS. Using either $0 or DATA flock'ing prevents two copies of the script from running on the same machine, but a script run from another machine proceeds as if nothing's flock'ed. I could see this being an issue if, say, a pool of webservers were serving your files.

A more robust solution might be to use a relational database's locking mechanism. (This is often pretty convenient if you're already using a DB for other tasks.) My favorite method is to create a lock table as follows. The MySQL:
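(The original DDL didn't survive here; this is a minimal sketch of the kind of lock table described. The table name foolock comes from the post below, but the column names and types are my assumptions.)

```sql
-- Hypothetical sketch: one row per named lock; the PRIMARY KEY is
-- what makes a second acquisition attempt fail.
CREATE TABLE foolock (
  lockname  VARCHAR(64)  NOT NULL PRIMARY KEY,  -- name of the lock
  pid       INT          NOT NULL,              -- holder's process id
  hostname  VARCHAR(255) NOT NULL,              -- holder's machine
  acquired  DATETIME     NOT NULL               -- when the lock was taken
);
```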
and the Perl
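(The Perl half is likewise missing. A sketch of how it plausibly worked, using DBI: get_lock INSERTs a row, which the primary key causes to fail if a row already exists, and finish_up DELETEs it. The DSN, credentials, and column names are assumptions.)

```perl
use strict;
use warnings;
use DBI;
use Sys::Hostname;

# Connection details are placeholders, not from the original post.
my $dbh = DBI->connect('dbi:mysql:mydb', 'user', 'pass',
                       { RaiseError => 1, PrintError => 0 });

# Try to take the lock: the PRIMARY KEY on lockname makes a second
# INSERT fail, so only one process at a time can hold the lock.
sub get_lock {
    my ($name) = @_;
    my $ok = eval {
        $dbh->do('INSERT INTO foolock (lockname, pid, hostname, acquired)
                  VALUES (?, ?, ?, NOW())',
                 undef, $name, $$, hostname());
    };
    return defined $ok;   # false => someone holds (or died holding) it
}

# Release the lock.  If the script dies before this runs, the row
# stays behind, which both blocks later runs and flags the crash.
sub finish_up {
    my ($name) = @_;
    $dbh->do('DELETE FROM foolock WHERE lockname = ? AND pid = ?',
             undef, $name, $$);
}

exit 0 unless get_lock('myscript');   # another copy is running: bail out
# ... do the real work ...
finish_up('myscript');
```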
The get_lock and finish_up routines let you detect (via leftover entries in the foolock table) when a script died without finishing. But, as laid out above, a stale entry then prevents other processes from acquiring the lock. (Which, for my task, was desirable.)
In Section: Seekers of Perl Wisdom