http://www.perlmonks.org?node_id=810676


in reply to RFC: A new module to help avoid running multiple instances of the same script (via cron, for example)

At that point, I decided to actually store the job's process id in the lock file.

The OS is better at managing concurrency because it cleans up after dead processes: a lock held by a crashed process is released automatically, so whether the file is currently locked (rather than what it contains) tells you whether another instance is running.

    package Cron::AvoidMultipleRuns;

    use strict;
    use warnings;

    use Cwd qw( realpath );
    use Errno qw( EWOULDBLOCK );
    use Fcntl qw( LOCK_EX LOCK_NB );
    use File::Spec::Functions qw( rel2abs );

    my $lock_file;
    my $lock_fh;

    {
        $lock_file = realpath(rel2abs($0)) . '.lock';

        open($lock_fh, '+>>', $lock_file)
            or die("Can't create lock file \"$lock_file\": $!\n");

        if (!flock($lock_fh, LOCK_EX|LOCK_NB)) {
            undef $lock_fh;
            if ($! == ($^O =~ /Win32/ ? 33 : EWOULDBLOCK)) {
                die("Another instance of this program is running. Exiting.\n");
            } else {
                die("Cannot lock lock file \"$lock_file\": $!\n");
            }
        }
    }

    END {
        if (defined($lock_fh)) {
            undef $lock_fh;      # Release lock
            unlink($lock_file);  # or warn("Can't unlink lock file \"$lock_file\": $!\n");
        }
    }

    1;

Your code had a bug: the lock can be defeated (accidentally or otherwise) by invoking the script through a symlink. The realpath(rel2abs($0)) call in my version fixes that.

die is overkill (punny!) for errors in removing the pid file. In fact, even a warning sounds unnecessary to me.

An even better method for Windows would be to create a named mutex instead of creating a file. That way, it gets cleaned up automatically.
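A sketch of that approach, assuming the Win32::Mutex module from the Win32-IPC distribution. The mutex name below is a made-up example, and the block is guarded so it's a no-op off Windows:

    use strict;
    use warnings;

    # Windows-only sketch: a named mutex is destroyed by the kernel when the
    # last handle to it closes, so a crashed instance leaves nothing behind.
    if ($^O =~ /Win32/) {
        require Win32::Mutex;    # from the Win32-IPC distribution

        # Mutex names may not contain backslashes, so sanitize the script path.
        (my $name = 'AvoidMultipleRuns_' . $0) =~ tr{\\/:}{_};

        my $mutex = Win32::Mutex->new(1, $name)
            or die("Can't create mutex \"$name\": $^E\n");

        # 183 == ERROR_ALREADY_EXISTS: another instance already owns the name.
        if (Win32::GetLastError() == 183) {
            die("Another instance of this program is running. Exiting.\n");
        }

        # ... do work; the handle (and the mutex) goes away when the process exits.
    }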

Alternatively, you could simplify the code a lot by simply locking the script itself.

    package Cron::AvoidMultipleRuns;

    use strict;
    use warnings;

    use Errno qw( EWOULDBLOCK );
    use Fcntl qw( LOCK_EX LOCK_NB );

    our $lock_fh;

    {
        open($lock_fh, '<', $0)
            or die("Can't open script \"$0\": $!\n");

        if (!flock($lock_fh, LOCK_EX|LOCK_NB)) {
            undef $lock_fh;
            if ($! == ($^O =~ /Win32/ ? 33 : EWOULDBLOCK)) {
                die("Another instance of this program is running. Exiting.\n");
            } else {
                die("Cannot lock script \"$0\": $!\n");
            }
        }
    }

    1;

I really dislike the name of the module. The module has nothing to do with cron and is not just useful for cron scripts, for starters. Then there's the problem that you don't want to prevent multiple runs. You want to prevent simultaneous runs.

A useful improvement would be to allow the caller to specify a lock file name if he's not happy with the default.
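A sketch of what that interface might look like; the lock_file import argument and its handling are my assumption, not part of the posted module:

    package Cron::AvoidMultipleRuns;

    use strict;
    use warnings;

    use Cwd qw( realpath );
    use Fcntl qw( LOCK_EX LOCK_NB );
    use File::Spec::Functions qw( rel2abs );

    my $lock_fh;

    sub import {
        my ($class, %args) = @_;

        # A caller-supplied name wins; otherwise fall back to the default.
        my $lock_file = defined($args{lock_file})
            ? $args{lock_file}
            : realpath(rel2abs($0)) . '.lock';

        open($lock_fh, '+>>', $lock_file)
            or die("Can't create lock file \"$lock_file\": $!\n");

        flock($lock_fh, LOCK_EX|LOCK_NB)
            or die("Another instance of this program is running. Exiting.\n");
    }

    1;

The caller would then write "use Cron::AvoidMultipleRuns lock_file => '/var/run/myjob.lock';", and the default behaviour is unchanged when the argument is omitted.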
