http://www.perlmonks.org?node_id=672403

KianTern has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,
I'm writing a multipart (4-5 forked processes) Linux daemon which of course needs logging.
I've created a simple class which I use in each daemon part.
It is not very convenient to use 4-5 log files for one daemon.
I want to use only one file for all the logging info, but I'm afraid of a situation where two parts try to write to the same file simultaneously.
I've read that this can be avoided with flock(), but flock() works on file handles, which in my case are unique to each process (class instance).
Can you please help me find a convenient way to lock the file before writing, so that it can't be accessed by another process?
PS. I want to avoid creating a lock file if possible.
Thanks.

Re: Writing to one file from multiple processes
by moritz (Cardinal) on Mar 06, 2008 at 09:28 UTC
    There are many log handlers on CPAN; I'm sure some of them will fit your needs.

    If you want to do it manually, it's probably best to set up a logging process that listens on a local socket and have all the other processes connect to it. It can spawn a thread for each connection, assemble the log messages in a queue, and write them to a file.
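
    A minimal sketch of that design, assuming a hypothetical socket path /tmp/mydaemon.log.sock and log file /tmp/mydaemon.log. Instead of one thread per connection, it multiplexes the clients with IO::Select, which keeps the whole logger in a single process:

        use strict;
        use warnings;
        use IO::Handle;
        use IO::Socket::UNIX;
        use IO::Select;

        my $sock_path = '/tmp/mydaemon.log.sock';   # hypothetical paths
        my $log_path  = '/tmp/mydaemon.log';

        unlink $sock_path;
        my $server = IO::Socket::UNIX->new(
            Type   => SOCK_STREAM,
            Local  => $sock_path,
            Listen => 5,
        ) or die "listen on $sock_path: $!";

        open my $log, '>>', $log_path or die "open $log_path: $!";
        $log->autoflush(1);

        my $sel = IO::Select->new($server);
        while (my @ready = $sel->can_read) {
            for my $fh (@ready) {
                if ($fh == $server) {
                    $sel->add($server->accept);   # a new worker connected
                }
                elsif (defined(my $line = <$fh>)) {
                    print $log $line;             # only this process writes
                }
                else {
                    $sel->remove($fh);            # worker went away
                    close $fh;
                }
            }
        }

    A worker then only needs to connect and print lines:

        use IO::Socket::UNIX;
        my $logger = IO::Socket::UNIX->new(
            Type => SOCK_STREAM,
            Peer => '/tmp/mydaemon.log.sock',
        ) or die "connect: $!";
        print $logger "[$$] worker started\n";

    Since the logger is the only process that ever touches the file, no locking is needed at all.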

Re: Writing to one file from multiple processes
by akho (Hermit) on Mar 06, 2008 at 09:37 UTC
      Thanks for the answers,
      I guess I'll stick with multiple files for now.
      Anyway, this is a temporary logging system, which I'll be moving to SQLite soon.
      I just thought that if there were a fast and easy solution to this, I'd use it.
      But it seems it's better to start developing the SQLite solution.
        Actually, that is a very good reason to go for Log::Log4perl from the beginning. Switching from file logging to database logging later is then a snap, as is adding e-mail notification, and so on.
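
        For illustration, a minimal Log::Log4perl setup driven by a configuration string (the log path is a placeholder):

            use strict;
            use warnings;
            use Log::Log4perl;

            # All output is routed through one appender defined in the config.
            my $conf = q(
                log4perl.rootLogger = INFO, LOGFILE
                log4perl.appender.LOGFILE          = Log::Log4perl::Appender::File
                log4perl.appender.LOGFILE.filename = /var/log/mydaemon.log
                log4perl.appender.LOGFILE.layout   = Log::Log4perl::Layout::PatternLayout
                log4perl.appender.LOGFILE.layout.ConversionPattern = %d %p %m%n
            );
            Log::Log4perl->init(\$conf);

            my $log = Log::Log4perl->get_logger();
            $log->info("daemon part $$ started");

        Moving to SQLite later should then mostly be a matter of swapping the appender lines in the config for Log::Log4perl::Appender::DBI; the $log->info(...) calls scattered through the daemon stay untouched.
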
Re: Writing to one file from multiple processes
by Anonymous Monk on Mar 06, 2008 at 09:27 UTC
    use flock

    "flock" is Perl's portable file locking interface, although it locks only entire files, not records.

Re: Writing to one file from multiple processes
by samtregar (Abbot) on Mar 06, 2008 at 19:14 UTC
    I want to use only one file for all the logging info, but I'm afraid of a situation where two parts try to write to the same file simultaneously.

    Is this an error log, used for debugging purposes? Or something that will contain important data you'll need to process later?

    If it's the former I'd say don't worry. Writes to a file in append mode will be effectively atomic if they're under 4k (the typical write buffer size). If you write more than that you'll get blocks interspersed between writers if they write at exactly the same time. (This advice does not apply to NFS, according to the open(2) man-page on Linux, btw.)
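
    In Perl terms, that no-locking pattern is just an append-mode handle with autoflush turned on, so each log line leaves the process as a single write(2) (path hypothetical):

        use strict;
        use warnings;
        use IO::Handle;

        open my $log, '>>', '/tmp/mydaemon.log' or die "open: $!";
        $log->autoflush(1);   # one print == one write(2), no buffer splitting

        print $log "[$$] short messages land intact without any locking\n";

    With the default buffering left on, one logical line could be split across several write(2) calls at buffer boundaries, and the atomicity argument above would no longer hold.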

    Thus you'll only see problems if your writers are very active and writing very large messages to the logs. If that's the case, you're going to hate your SQLite implementation, since it's well known not to deal well with high levels of concurrency.

    -sam