PerlMonks  

Writing to one file from multiple processes

by KianTern (Acolyte)
on Mar 06, 2008 at 09:02 UTC ( #672403=perlquestion )
KianTern has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,
I'm writing a multi-part (4-5 forked processes) Linux daemon which of course needs logging.
I've created a simple logging class which I use in each part of the daemon.
It is not very convenient to use 4-5 log files for one daemon.
I want to use only one file for all the logging output, but I'm afraid of a situation where two parts try to write to the same file simultaneously.
I've read that this can be avoided with flock(), but flock() works on filehandles, which in my case are unique to each process (class instance).
Can you please help me find a convenient way to lock the file before writing, so that it can't be written to by another process at the same time?
PS. I want to avoid creating a lock file if possible.
Thanks.

Re: Writing to one file from multiple processes
by Anonymous Monk on Mar 06, 2008 at 09:27 UTC
    use flock

    "flock" is Perl's portable file locking interface, although it locks only entire files, not records.

Re: Writing to one file from multiple processes
by moritz (Cardinal) on Mar 06, 2008 at 09:28 UTC
    There are many log handlers on CPAN, I'm sure some of them will fit your needs.

    If you want to do it manually, it's probably good to set up a logging process that listens on a local socket, with all the other processes connecting to it. It can spawn a thread for each connection, assemble the log messages in a queue and write them to a single file.
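    A minimal sketch of that idea, using a UNIX-domain socket and no threads (a single accept loop is enough for a handful of writers); the socket path, filename and SHUTDOWN convention are invented for the example:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::UNIX;
use IO::Handle;

# Hypothetical sketch: one dedicated logger process owns the log file;
# worker processes send lines over a local socket instead of writing
# the file themselves, so no locking is needed anywhere.
my $path = "/tmp/daemon-log-$$.sock";
unlink $path;

my $server = IO::Socket::UNIX->new(
    Type   => SOCK_STREAM(),
    Local  => $path,
    Listen => 5,
) or die "Cannot listen on $path: $!";

my $pid = fork // die "fork failed: $!";
if ($pid == 0) {                     # child: the logger process
    open my $log, '>>', 'daemon.log' or die "Cannot open log: $!";
    $log->autoflush(1);
    ACCEPT: while (my $client = $server->accept) {
        while (my $line = <$client>) {
            last ACCEPT if $line eq "SHUTDOWN\n";
            print {$log} $line;      # single writer: no locking needed
        }
    }
    exit 0;
}

# parent: acts as one of the forked daemon parts
close $server;
my $conn = IO::Socket::UNIX->new(Peer => $path)
    or die "Cannot connect to logger: $!";
print {$conn} "[$$] worker started\n";
print {$conn} "SHUTDOWN\n";
close $conn;
waitpid($pid, 0);
unlink $path;
```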

Re: Writing to one file from multiple processes
by akho (Hermit) on Mar 06, 2008 at 09:37 UTC
      Thanks for the answers,
      I guess I'll stick with multiple files for now.
      Anyway, this is a temporary logging system, which I'll be moving to SQLite soon.
      I just thought that if there were a fast and easy solution to this, I'd use it.
      But it seems it's better to start developing the SQLite solution.
        Actually, that is a very good reason to go for Log::Log4perl from the beginning. Switching from file logging to database logging later is then a snap, as is adding e-mail notification, and so on.
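        For example, a minimal Log::Log4perl setup along those lines might look like this (the filename and layout are illustrative); moving to a database later would mean swapping the appender in the config, e.g. to Log::Log4perl::Appender::DBI, without touching any of the logging calls:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Log::Log4perl;

# Illustrative config: everything at DEBUG and above goes to one file
# appender. Changing the backend later is a config edit, not a code edit.
my $conf = q(
    log4perl.rootLogger                = DEBUG, LOGFILE
    log4perl.appender.LOGFILE          = Log::Log4perl::Appender::File
    log4perl.appender.LOGFILE.filename = daemon.log
    log4perl.appender.LOGFILE.mode     = append
    log4perl.appender.LOGFILE.layout   = Log::Log4perl::PatternLayout
    log4perl.appender.LOGFILE.layout.ConversionPattern = %d %p %m%n
);
Log::Log4perl->init(\$conf);

my $logger = Log::Log4perl->get_logger();
$logger->info("worker started");
```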
Re: Writing to one file from multiple processes
by samtregar (Abbot) on Mar 06, 2008 at 19:14 UTC
    I want to use only one file for all the logging output, but I'm afraid of a situation where two parts try to write to the same file simultaneously.

    Is this an error log, used for debugging purposes? Or something that will contain important data you'll need to process later?

    If it's the former I'd say don't worry. Writes to a file in append mode will be effectively atomic if they're under 4k (the typical write buffer size). If you write more than that you'll get blocks interspersed between writers if they write at exactly the same time. (This advice does not apply to NFS, according to the open(2) man-page on Linux, btw.)

    Thus you'll only see problems if your writers are very active and are writing very large messages to the log. If that's the case you're going to hate your SQLite implementation, since SQLite is well known not to deal well with high levels of concurrency.

    -sam
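    The append-mode claim above can be sketched like this: several forked writers append short lines to one file with no locking at all, and on a local filesystem every line should still come out whole (the filename and line counts are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;

# Four forked writers append short lines to one shared file opened in
# append mode, with no flock. Each print is a single small write, so the
# kernel's O_APPEND handling should keep every line intact.
my $file = 'shared.log';
unlink $file;

my @pids;
for my $worker (1 .. 4) {
    my $pid = fork // die "fork failed: $!";
    if ($pid == 0) {
        open my $fh, '>>', $file or die "Cannot open $file: $!";
        $fh->autoflush(1);          # one write() per print
        print {$fh} "worker $worker line $_\n" for 1 .. 100;
        exit 0;
    }
    push @pids, $pid;
}
waitpid($_, 0) for @pids;

open my $in, '<', $file or die "Cannot read $file: $!";
my @lines = <$in>;
printf "%d lines, all intact: %s\n", scalar @lines,
    (grep { !/^worker \d+ line \d+$/ } @lines) ? "no" : "yes";
```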

Node Type: perlquestion [id://672403]
Approved by Corion