Suppose I have a CGI script running on a public webserver. This script consists of a top-level script plus multiple Perl module source files. From time to time I improve the script and then install new versions of all the files. I'd like to make sure that no running instance of the script ever sees an inconsistent mix of old and new files if it happens to be running at the same time as I'm doing the install.
Currently my install command just copies all the files into the directory they're run from. This means a CGI instance could be started before the install but load a module after it, ending up with the old top-level script and a new module (or vice versa). This and similar races are what I'd like to prevent.
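For concreteness, here's a minimal sketch of the kind of install command I mean (all paths and file names are made up for illustration; the real ones don't matter):

```shell
#!/bin/sh
set -e
# Hypothetical layout: a staging dir with the freshly built files,
# and the live dir that Apache runs the CGI from.
SRC=$(mktemp -d)    # stands in for the build output directory
DEST=$(mktemp -d)   # stands in for e.g. /var/www/cgi-bin/myapp
mkdir "$SRC"/lib
echo 'new script' > "$SRC"/myscript.cgi
echo 'new module' > "$SRC"/lib/MyModule.pm

# The install is just per-file copies. Each cp replaces one file
# (and cp isn't even atomic per-file: it truncates and rewrites in
# place), and the set of files is never swapped as a unit, so a CGI
# process started between the two cp commands can see the new script
# together with the old module, or vice versa.
cp "$SRC"/myscript.cgi "$DEST"/
mkdir -p "$DEST"/lib
cp "$SRC"/lib/MyModule.pm "$DEST"/lib/
```

There is no synchronization at all between this and any CGI processes that are already running or are started mid-copy.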
I don't mind temporarily making the webpage unavailable for a short period during the install; that is, if a user tries to use the CGI during such an unfortunate window, it's okay if they get a 403 error or a broken connection. It's also okay if the install has to wait a short time until all already-started instances of the CGI exit. It is not okay, however, to have to stop the whole Apache webserver for the installation (the webserver serves pages other than this CGI), reboot the machine, or otherwise break other functionality on the host.
I'd like to know if there's an established good practice for this. If there isn't, I can probably invent a solution myself, but it would likely be ugly.