Multiprocess logging mechanism

by Anonymous Monk
on Feb 26, 2002 at 09:55 UTC ( #147517=perlquestion )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

I'm writing a program that forks 4 processes (on Linux). I want all of these processes to log all their work to the same file (ordered by date). Is there a Perl module to do so? If not, I think locking the file is one possibility, but in that case, do I have to open+lock+write+close the file for every record I log? Or is there a way to keep the file open in all 4 processes and only lock for the write operation?

Replies are listed 'Best First'.
Re: Multiprocess logging mechanism
by clemburg (Curate) on Feb 26, 2002 at 12:34 UTC

    If logging to a *file* is not a requirement, but just an option, you might just as well log to a database. With a decent database system, a lot of problems customarily associated with handling logs just disappear. Still another option might be to use Sys::Syslog or Unix::Syslog.
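
    For the syslog route, a minimal Sys::Syslog sketch might look like this (the ident string 'mylogd' and the 'user' facility here are just placeholders); syslogd then takes care of serialising entries from all four processes:

        use Sys::Syslog qw(openlog syslog closelog);

        openlog('mylogd', 'pid', 'user');   # ident, options, facility
        syslog('info', 'worker %d wrote a record', $$);
        closelog();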

    Christian Lemburg
    Brainbench MVP for Perl
    http://www.brainbench.com

      Along the same lines, you can cheat and simply dump to STDOUT or STDERR and redirect from the invoking shell. There shouldn't be any issues with serialization.
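
      A rough sketch of that, assuming the script is invoked as perl workers.pl >> app.log (the >> matters: it opens the log in append mode, so each short, autoflushed print should land as a single atomic append on local filesystems):

        use strict;
        use warnings;

        $| = 1;                 # autoflush: one print, one write
        for my $n (1 .. 4) {
            my $pid = fork();
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) {    # child: do the work, log a line to STDOUT
                print scalar(localtime), " [$$] child $n finished\n";
                exit 0;
            }
        }
        wait() for 1 .. 4;      # reap the children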

      --
      perl -pe "s/\b;([st])/'\1/mg"

Re: Multiprocess logging mechanism
by simon.proctor (Vicar) on Feb 26, 2002 at 10:07 UTC
    Unless you want to mess about with semaphores, I suggest you stick with the old lock system.
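    A minimal sketch of the lock-per-write approach (the log path and message format are placeholders); close() releases the lock:

        use strict;
        use warnings;
        use Fcntl qw(:flock SEEK_END);

        sub log_line {
            my ($msg) = @_;
            open my $fh, '>>', '/var/tmp/app.log' or die "open: $!";
            flock($fh, LOCK_EX)                   or die "flock: $!";
            seek($fh, 0, SEEK_END);  # another process may have appended while we waited
            print {$fh} scalar(localtime), " [$$] $msg\n";
            close $fh                             or die "close: $!";
        }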
Re: Multiprocess logging mechanism
by Ido (Hermit) on Feb 26, 2002 at 10:56 UTC
    I'm not sure I understood, but if I did, what you want is actually to unlock:

    use Fcntl ':flock';
    # ...
    flock(FH, LOCK_UN);   # or: flock(FH, 8)
      While a file's lock can be released and the file handle held open between writes, this should not be done unless you also remember to resynchronise the file handle when you subsequently reacquire the lock. E.g.

      flock(FH, LOCK_EX);   # acquire exclusive lock
      seek(FH, 0, 2);       # resynchronise: seek to the (possibly new) end of file

      Failure to resynchronise the file handle can cause many problems where multiple processes are accessing and updating the file simultaneously. This aspect of file locking was discussed in detail in this thread.
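
      Putting the unlock and the resynchronisation together, a sketch of the keep-it-open variant (the path is a placeholder; autoflush ensures the line is written out before the lock is released):

        use strict;
        use warnings;
        use Fcntl qw(:flock SEEK_END);
        use IO::Handle;

        open my $fh, '>>', '/var/tmp/app.log' or die "open: $!";
        $fh->autoflush(1);    # flush each print before unlocking

        sub log_line {
            my ($msg) = @_;
            flock($fh, LOCK_EX) or die "flock: $!";
            seek($fh, 0, SEEK_END);   # resynchronise to the true end of file
            print {$fh} scalar(localtime), " [$$] $msg\n";
            flock($fh, LOCK_UN);
        }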

       

      perl -e 's&&rob@cowsnet.com.au&&&split/[@.]/&&s&.com.&_&&&print'

Re: Multiprocess logging mechanism
by Anonymous Monk on Feb 26, 2002 at 12:29 UTC
    If the process which is forking them is available to do stuff, then you could fork them with an open(FH, "-|"), in which case each forked process's STDOUT would go to a file handle for the main process to collate them all together.
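
    A sketch of that piped-fork idea (the work the children do here is obviously made up); the parent is the only writer, so no locking is needed:

        use strict;
        use warnings;

        open my $log, '>>', '/var/tmp/app.log' or die "open: $!";
        my @readers;
        for my $n (1 .. 4) {
            my $pid = open my $fh, '-|';   # fork; child's STDOUT feeds $fh
            die "fork failed: $!" unless defined $pid;
            if ($pid == 0) {               # child
                print "child $n ($$) did some work\n";
                exit 0;
            }
            push @readers, $fh;            # parent keeps the read end
        }
        # drain each child in turn; a fancier collator could use select()
        for my $fh (@readers) {
            print {$log} scalar(localtime), ' ', $_ while <$fh>;
            close $fh;
        }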

    Another method is to have each one logging to separate files and then have the main process stick them all together when it closes, but that's a bit dodgy I would say.

    But otherwise, if your main process does need to do stuff, then you should probably go for the open+lock+write+close.

    I'm not sure, but you might not have to open it every time, just lock it; don't hold me to that, though.

    Toby
Re: Multiprocess logging mechanism
by perrin (Chancellor) on Feb 26, 2002 at 16:01 UTC
    There are several logging packages on CPAN that should handle this. The syslog route might be the simplest.
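
    For instance, Log::Log4perl's easy mode gets you file logging in a few lines (the path is a placeholder; check its docs for how its file appender behaves across forked processes):

        use Log::Log4perl qw(:easy);

        Log::Log4perl->easy_init({ level => $INFO, file => '>>/var/tmp/app.log' });
        INFO("worker $$ wrote a record");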
