
Having several processes share a logfile

by ibm1620 (Monk)
on Jan 02, 2013 at 22:00 UTC
ibm1620 has asked for the wisdom of the Perl Monks concerning the following question:

I'm designing a multiforking server and I'd like to have all the processes write to the same logfile (correctly interleaved, of course). Is there a module recommended for that?

Re: Having several processes share a logfile
by pemungkah (Priest) on Jan 02, 2013 at 22:59 UTC
Re: Having several processes share a logfile
by davido (Archbishop) on Jan 03, 2013 at 03:45 UTC

    The POD for flock demonstrates how to obtain an advisory lock and then seek to the EOF before appending. The example is pretty simple. Look it over and I'm sure it will make sense.


      This is what I ended up doing - easy enough for development. Thanks! Chap
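    For reference, a minimal sketch of the pattern the flock POD describes: take an exclusive advisory lock, re-seek to end-of-file (in case another process appended while we waited), write, and release. The filename and message are placeholders for illustration.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock SEEK_END);

# Hypothetical logfile name, for illustration only.
my $logfile = 'shared.log';

sub log_line {
    my ($msg) = @_;
    open my $fh, '>>', $logfile or die "Can't open $logfile: $!";
    flock( $fh, LOCK_EX )       or die "Can't lock $logfile: $!";
    # Another process may have appended while we waited for the lock,
    # so seek to the current end of file before writing.
    seek( $fh, 0, SEEK_END )    or die "Can't seek on $logfile: $!";
    print {$fh} "[$$] $msg\n";
    flock( $fh, LOCK_UN )       or die "Can't unlock $logfile: $!";
    close $fh                   or die "Can't close $logfile: $!";
}

log_line("child starting");
```

    Each forked child can call `log_line` independently; the lock guarantees whole lines never interleave mid-write.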
Re: Having several processes share a logfile
by flexvault (Monsignor) on Jan 03, 2013 at 12:18 UTC

    Hello ibm1620,

    As was already mentioned, I use the *nix 'syslog' in production. But since you're in development mode at the present time, I suggest you use a unique log file for each 'fork'ed child.

        use Sys::Syslog qw(:DEFAULT setlogsock);

        our ( $LOG, $Debug, $start );
        $Debug = 4;                  ## I use 0 for production, and 1..9 for testing
        our $NAME = "$prog-$Debug";  ## $prog should be a name to help you

        setlogsock('unix');
        openlog( " $$ Srv", 'ndelay', 'localn' );  ## 'n' is defined by system admin
        syslog( 'info', "############# Starting #############" );
        ...
        if ( $Debug == 0 )  ## This is production
          {
            open( $LOG, ">", "/dev/null" ) || Die_RTN("$NAME: Not open-8 /dev/null");
          }
        else                ## This will log for levels 1..9
          {
            open( $LOG, ">", "./logs/Child$$" ) || Die_RTN("$NAME: Not open-9 Child$$");
            my $time = time;
            $start = 0;
            print $LOG "Info: |$time|$start|\n";
          }
        $start++;
        if ( $Debug >= 4 )  ## This is for level 4 or greater debug information
          {
            my $ctime = time;
            print $LOG "Info: |$ctime|$start|\n";
          }
        ...
        closelog();
        close $LOG;

    This allows you to see what logs are generated for the production environment, and also allows you to have very detailed debugging information during development. Once code is working correctly, you can make the test for a higher number and eliminate unneeded data from the child logs.

    I open the 'syslog' before forking the children, and then open the debugging logs in the children. The '$$' is the unique process ID of the child.

    Hope this helps and that I typed everything correctly :-)

    Good Luck...Ed

    "Well done is better than well said." - Benjamin Franklin

Re: Having several processes share a logfile
by Anonymous Monk on Jan 03, 2013 at 08:33 UTC

    How about using the syslog facility on *nix? A few Perl interfaces to it can be found on CPAN.
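    For the curious, the core Sys::Syslog module is one such interface. A minimal sketch (the ident string and facility here are placeholders; your sysadmin decides which `localN` facility to use): the `'pid'` option tags every entry with the writing process's PID, so interleaving is handled by the syslog daemon.

```perl
use strict;
use warnings;
use Sys::Syslog qw(:standard);

# "myserver" and 'local0' are illustrative choices, not requirements.
openlog( "myserver", 'ndelay,pid', 'local0' );
syslog( 'info', 'child %d handling request', $$ );
closelog();
```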

Re: Having several processes share a logfile
by nagalenoj (Friar) on Jan 03, 2013 at 11:43 UTC

    I've used Log::Log4perl and it worked fine for the same need.
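    A sketch of how that might look: Log::Log4perl's file appender has a `syswrite` option that writes each message with a single `syswrite()` call, which keeps lines from separate processes from interleaving. The filename and pattern below are illustrative assumptions, not part of the original post.

```perl
use strict;
use warnings;
use Log::Log4perl;

# %P in the layout expands to the writing process's PID.
my $conf = q(
    log4perl.logger                    = INFO, Logfile
    log4perl.appender.Logfile          = Log::Log4perl::Appender::File
    log4perl.appender.Logfile.filename = server.log
    log4perl.appender.Logfile.mode     = append
    log4perl.appender.Logfile.syswrite = 1
    log4perl.appender.Logfile.layout   = Log::Log4perl::PatternLayout
    log4perl.appender.Logfile.layout.ConversionPattern = %d [%P] %m%n
);
Log::Log4perl::init( \$conf );

my $log = Log::Log4perl->get_logger();
$log->info("child $$ starting");
```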

Node Type: perlquestion [id://1011363]
Approved by Corion