http://www.perlmonks.org?node_id=590699


in reply to Ensuring only one copy of a perl script is running at a time

"A workmate asked me the best way to ensure that only one copy of his Perl script is running at a time".

I just developed a Perl script that needed to ensure exactly that, since it is CGI and can be run by multiple users over the internet, and I need to ensure only one concurrent usage. I simply created a file called "status.txt" on the server, and had the script overwrite the file contents with its status at start and exit - 0 was disabled, 1 was enabled but not in use, 2 was enabled and in use. I'm sure it's not the quickest or best way, but as always with Perl, TMTOWTDI and all that, and it's simple and works just fine.
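In outline, the script does something like this (just a sketch - the helper names are illustrative, not the real code):

    use strict;
    use warnings;

    my $file = 'status.txt';

    # Read the single status code from the file.
    # Assumes the file already holds one of 0/1/2.
    sub read_status {
        open my $fh, '<', $file or die "Can't read $file: $!";
        my $status = <$fh>;
        close $fh;
        chomp $status;
        return $status;
    }

    # Overwrite the file with a new status code.
    sub write_status {
        my ($status) = @_;
        open my $fh, '>', $file or die "Can't write $file: $!";
        print $fh "$status\n";
        close $fh;
    }

    exit 0 if read_status() != 1;   # 0 = disabled, 2 = already in use
    write_status(2);                # enabled and in use
    # ... do the real work ...
    write_status(1);                # enabled but not in use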

It also has the benefit of letting you examine the file contents yourself and decide on an action based on them. You could also use this to set a limit of, say, 2 concurrent accesses, or 3 or 4 or... you get the point, by simply incrementing on each script start and decrementing (is that a word?) on each script exit.
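Reusing the read_status/write_status helpers from the sketch above, the counting variant is only a few lines (again, just a sketch):

    my $MAX = 2;                        # allow up to 2 concurrent users

    my $count = read_status();
    exit 0 if $count >= $MAX;           # limit reached
    write_status($count + 1);           # one more user
    # ... do the real work ...
    write_status(read_status() - 1);    # done; one fewer user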

Re^2: Ensuring only one copy of a perl script is running at a time (race)
by tye (Sage) on Dec 19, 2006 at 17:25 UTC

    It sounds like it is time for you to update your computer science knowledge by learning about race conditions.

    I need to ensure only one concurrent usage [....] works just fine

    It works just fine as far as you have noticed so far. It certainly doesn't "ensure" only one concurrent use; it's more like it usually prevents more than one concurrent use. (:

    Your code must perform the following steps:

    1. Check current status
    2. if not 1 then exit
    3. Set current status to 2
    4. Do work
    5. Set current status back to 1

    And, in a modern computer system, CPU resources are shared, so each process that is serving a CGI request can be interrupted between any of those steps (or in the middle of a step) in order to let some other process do some work for a bit. Two CGI requests coming in at roughly the same time can thus perform those steps in the following order:

    One process                     Other process
    --------------------------      --------------------------
    my $status= CheckStatus();
    exit if 1 != $status;
                                    my $status= CheckStatus();
                                    exit if 1 != $status;
    SetStatus( 2 );
                                    SetStatus( 2 );
    DoWork();
                                    DoWork();
    SetStatus( 1 );
                                    SetStatus( 1 );

    Note that they both see the status as "1" and both end up running concurrently. This is why operating systems provide locking mechanisms and why you often need to use such.
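    For example, here is a sketch of one common fix (the lock file name is illustrative): take an exclusive flock up front and hold it for the duration of the work, so two processes can never interleave the steps above:

        use strict;
        use warnings;
        use Fcntl qw(:flock);

        open my $lock, '>>', 'status.lock' or die "Can't open lock file: $!";

        # LOCK_NB: give up immediately instead of waiting for the lock.
        unless ( flock $lock, LOCK_EX | LOCK_NB ) {
            print "Another instance is already running.\n";
            exit 0;
        }

        # ... do the real work; the lock is held until $lock is
        #     closed or the process exits ...

        close $lock;    # releases the lock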

    - tye        

      You will be happy to learn that I do use flock on opening the file so there is no danger of what you describe happening.

      Rather than locking a file for the duration of the script, I simply lock it while changing its status etc. As posted, this gives me greater flexibility to record more info than just in use/not in use, by writing status codes to the file contents. Thanks anyway!
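      In outline (a sketch rather than the real script), the point is that the read, the check, and the write all happen under a single flock:

          use Fcntl qw(:flock SEEK_SET);

          # Returns true if we got the "in use" slot, false otherwise.
          sub try_acquire {
              open my $fh, '+<', 'status.txt' or die "Can't open status.txt: $!";
              flock $fh, LOCK_EX or die "Can't lock: $!";
              my $status = <$fh>;
              chomp $status;
              if ( $status != 1 ) {   # disabled, or already in use
                  close $fh;          # releases the lock
                  return 0;
              }
              seek $fh, 0, SEEK_SET;
              truncate $fh, 0;
              print $fh "2\n";        # enabled and in use
              close $fh;              # flushes and releases the lock
              return 1;
          }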

      Dan
Re^2: Ensuring only one copy of a perl script is running at a time
by ikegami (Patriarch) on Dec 19, 2006 at 17:32 UTC

    That's wrong. It suffers from a race condition.

        +===============================+===============================+
        |           Process 1           |           Process 2           |
        +===============================+===============================+
        | open the status file          |                               |
      T | read the status (status is 1) |                               |
      i +-------------------------------+-------------------------------+
      m |                               | open the status file          |
      e |                               | read the status (status is 1) |
      | |                               | write 2 to the status file    |
      | |                               | proceed                       |
        +-------------------------------+-------------------------------+
      v | write 2 to the status file    |                               |
        | proceed                       |                               |
        +===============================+===============================+

    If you use flock, then it's just an extension of what the OP posted.
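    (For reference - a sketch, not necessarily the OP's exact code - the usual flock idiom needs no status file at all; the script just locks its own file:)

        use strict;
        use warnings;
        use Fcntl qw(:flock);

        # Lock the script file itself; flock works on a read-only handle.
        open my $self, '<', $0 or die "Can't open $0: $!";
        flock $self, LOCK_EX | LOCK_NB
            or die "Another copy of $0 is already running\n";

        # ... do the real work; the lock lasts until the process exits ...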

    ( Oops! Seems like tye posted something similar when I was writing this node. )
