
Daemon, log4perl & stderr/stdout

by saberworks (Curate)
on May 27, 2008 at 21:10 UTC ( #688741=perlquestion )
saberworks has asked for the wisdom of the Perl Monks concerning the following question:

I have a question about forwarding STDERR & STDOUT to a log4perl object on a per-job basis (inside a while loop). Here is some background:

I have a daemon program (uses Proc::Daemon) that grabs "jobs" to process from a database. Once it has a job, it uses the info from the database to shell out and "process" it. The processing is basically calling a bunch of external programs in sequence and making sure the output of those programs looks sane. There are probably 8-10 steps in a "job", each handled by a different program.

When each new job is started, a new Log::Log4perl "category" is created and this category is logged to that job's "run path" (the directory in which the commands are run). I can then call:
    $log->info('some global log message');
    $job_log->info('some log message specific to this job');
This allows me to log messages specific to a job to one log file and all other messages to a global log file. This is all working fine.
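For reference, a per-job logger like this can be built programmatically; a minimal sketch, assuming Log::Log4perl is installed (the category name, file name, and layout here are illustrative, not from the original post):

```perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger :levels);

# Hypothetical helper: build a logger category whose appender writes
# to the job's run path. Names are made up for illustration.
sub make_job_logger {
    my ($job_id, $run_path) = @_;

    my $appender = Log::Log4perl::Appender->new(
        'Log::Log4perl::Appender::File',
        name     => "job_$job_id",
        filename => "$run_path/job.log",
        mode     => 'append',
    );
    $appender->layout(
        Log::Log4perl::Layout::PatternLayout->new('%d %p %m%n')
    );

    my $job_log = get_logger("Job::$job_id");
    $job_log->level($INFO);
    $job_log->add_appender($appender);
    return $job_log;
}
```

Because each job gets its own category ("Job::$job_id" here), messages logged through it stay out of the global log file.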

Although we hope that there will never be errors with the various external programs, we are more realistic and we know that we will at some point have to debug some errors. So I would like to forward the STDERR and STDOUT from these external programs to my $job_log log4perl object.

When Proc::Daemon runs, it somehow takes over STDOUT and STDERR so I don't see the messages. I'm not sure this matters.

The question is, how do I make it so messages written to STDOUT and STDERR are forwarded (and timestamped) by Log::Log4perl?

I have seen this FAQ answer:

Some module prints messages to STDERR. How can I funnel them to Log::Log4perl?

That answer is somewhat related but doesn't really address my question. It uses a DEBUG function call rather than a method call on a logger object like $job_log->debug(), and I'm not sure whether that distinction matters. Any help is greatly appreciated.
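For what it's worth, the FAQ's tie trick works the same whether the sink is a DEBUG call or a logger object: the tied handle's PRINT method can call any code you like. A self-contained sketch using core Perl only, with a callback and an array standing in for the $job_log object:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Tied-handle trapper: anything printed to STDERR is timestamped and
# handed to a callback. Here the callback pushes onto an array; in the
# daemon it would call $job_log->error($msg) instead.
package StderrTrapper;
sub TIEHANDLE { my ($class, $cb) = @_; bless { cb => $cb }, $class }
sub PRINT  { my $self = shift; $self->{cb}->(join '', @_) }
sub PRINTF { my $self = shift; my $fmt = shift; $self->PRINT(sprintf $fmt, @_) }

package main;

my @captured;
tie *STDERR, 'StderrTrapper',
    sub { push @captured, '[' . localtime() . '] ' . $_[0] };

print STDERR "external program complained\n";
untie *STDERR;

print "got: $captured[0]";
```

One caveat: a tied handle only catches writes made through Perl's STDERR file handle in this process. External programs write to file descriptor 2 directly, so their output bypasses the tie; capturing that needs a pipe, as in the replies below.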

Replies are listed 'Best First'.
Re: Daemon, log4perl & stderr/stdout
by pc88mxer (Vicar) on May 27, 2008 at 21:47 UTC
    If I understand your problem, you want a child process to send its STDOUT and STDERR output through a Log4Perl object that is located in the parent process.

    If so, then the parent has to process the child's output and pass it to the logging object. Here's a simple way to do this for just STDOUT:

    my $pid = open(CHILD, "-|", 'child', @child_args);
    if ($pid) {
        while (<CHILD>) { $logger->info($_) }
    }
    close(CHILD);
    If it's okay to merge STDOUT and STDERR, you can do something like this:
    if (my $pid = open(CHILD, "-|")) {
        while (<CHILD>) { $logger->info($_); }
        close(CHILD);
    }
    else {
        open(STDERR, '>&STDOUT');
        exec('child', @child_args) || die "exec failed";
    }
    When Proc::Daemon runs, it somehow takes over STDOUT and STDERR so I don't see the messages.
    All Proc::Daemon is doing is redirecting STDOUT and STDERR to /dev/null.
      Thanks for your reply. I guess I should be clear, when I'm running "child" processes I'm using system or backticks, not actually opening child processes. I guess I could do what you mention, although really these commands need to run in sequence, not parallel, so I'm not sure forking a child of the daemon manually like that is necessary.
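Since the commands run sequentially via backticks anyway, one low-tech option along these lines is to append 2>&1 so the command's STDERR is merged into the captured output, then hand the result to the job logger. A core-Perl sketch, assuming a POSIX shell ($^X, the current perl binary, stands in for one of the external programs, and the logging calls in the comment are illustrative):

```perl
use strict;
use warnings;

# Run one "step" and capture its merged stdout+stderr via the shell.
my $output = `$^X -e 'print "step ok\n"; warn "step warning\n"' 2>&1`;
my $status = $? >> 8;

# In the daemon this would be something like:
#   $job_log->info($output);
#   $job_log->error("step failed with status $status") if $status;
print "status=$status\n";
print $output;
```

This keeps the serial flow intact; the only change is that each step's diagnostics end up in a variable instead of being lost.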

      Also, about Proc::Daemon redirecting STDOUT and STDERR to /dev/null, is there a way to re-redirect them? When I don't run in "daemon" mode (I comment out Proc::Daemon::Init()), I can do this:
      open STDERR, '>', "$run_path/stderr.txt" or die "Can't redirect STDERR: $!";
      And the STDERR output from each command subsequently run using backticks (`/some/command`) gets written to the stderr.txt log file. But if I run in daemon mode (uncomment the Proc::Daemon::Init() call), the STDERR & STDOUT info is getting lost, even though I'm opening them as above. I wonder if I need to close them first. (Update: closing them first makes no difference; it's as if once Proc::Daemon gets hold of them I can't redirect them elsewhere.)

        You need to make sure fileno(STDERR) is equal to 2.

        fileno()==0 will be used as the child's STDIN.
        fileno()==1 will be used as the child's STDOUT.
        fileno()==2 will be used as the child's STDERR.

        The above will allow you to redirect STDERR to a file, but system file handles (as opposed to Perl file handles) can't be redirected to a Perl function (such as the log4perl handler). If you wish to do that, you'll have to use a temporary file, IPC::Open3 or something like pc88mxer's code.
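The IPC::Open3 route keeps the two streams separate, so each can go to its own logging call. A core-Perl sketch (fine for modest output; large output on both streams can deadlock without a select loop or something like IPC::Run, and $^X again stands in for the external program):

```perl
use strict;
use warnings;
use IPC::Open3;
use Symbol qw(gensym);

# Run a child and capture STDOUT and STDERR separately.
my $err = gensym;   # open3 needs a pre-created glob for the error handle
my $pid = open3(my $in, my $out, $err,
                $^X, '-e', 'print "normal output\n"; print STDERR "error output\n"');
close $in;

my @stdout_lines = <$out>;
my @stderr_lines = <$err>;
waitpid($pid, 0);

# In the daemon: $job_log->info(@stdout_lines); $job_log->error(@stderr_lines);
print "OUT: $_" for @stdout_lines;
print "ERR: $_" for @stderr_lines;
```

Note that unlike the wtr/rdr handles, the error handle is not autovivified by open3, which is why gensym is needed.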

        You'll be running the children in sequence because the parent will be capturing the output from each child until the child terminates before spawning the next child.

        As for STDOUT and STDERR when using Proc::Daemon, it just redirects them to /dev/null as a convenience. There's no problem re-opening them to other files. Closing those handles first is not necessary. When attempting to open an already open file handle, perl will close it first.
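In fact, closing them manually is what causes trouble: once STDERR is closed, descriptor 2 is freed and the very next open grabs it, so a later reopen of STDERR lands on a different descriptor, and children spawned afterwards inherit the wrong fd 2. A core-Perl demo (the exact numbers assume a normal environment with fds 0-2 open at start):

```perl
use strict;
use warnings;

my $before = fileno(STDERR);             # normally 2

close STDERR;                            # frees descriptor 2
open(my $grabber, '<', '/dev/null')      # lowest free fd is 2, so this takes it
    or die "open grabber: $!";
open(STDERR, '>', '/dev/null') or die;   # STDERR ends up on a higher fd

my $after = fileno(STDERR);
print "before=$before after=$after grabber=", fileno($grabber), "\n";
```

Reopening STDERR without closing it first avoids the problem entirely, because the implicit close and the new open happen as one operation on the same handle.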

        If the open STDERR ... call is not working in daemon mode, perhaps it's because it can't write to or create "$run_path/stderr.txt". What are the permissions on the directory and file, and what userid is the script running as in daemon mode?

        Works for me
        #!/usr/bin/perl
        use Proc::Daemon qw( );
        use File::Basename qw( dirname );
        use File::Spec::Functions qw( rel2abs );

        my $homedir;
        BEGIN { $homedir = dirname(rel2abs($0)); }

        {
            Proc::Daemon::Init;
            chdir($homedir);
            open STDERR, '>', 'stderr.txt';
            system(q{perl -e'print STDERR "mwuhahahaha\n"'});
        }
        $ cat stderr.txt
        mwuhahahaha
        Could you show us what doesn't work?
Re: Daemon, log4perl & stderr/stdout
by saberworks (Curate) on May 30, 2008 at 16:31 UTC
    Just an update on how I got step one working -- piping standard error and standard out to their own files on a per-job basis (so each job gets a separate set of log files). The suggestions to just reopen STDERR and STDOUT weren't working because I was opening and then closing them inside a while loop, and when I closed and later reopened them, they got new fileno() values (as a previous poster mentioned, I should have checked that the number was 2). Instead of closing them at the end of the loop, I just reopened them back to /dev/null and everything worked fine. Here is some pseudocode that may make it easier to understand:
    while (my $job = get_next_job()) {
        my $run_path = $job->get_run_path();

        open STDERR, '>', "$run_path/stderr.txt" or die;
        open STDOUT, '>', "$run_path/stdout.txt" or die;

        $job->run();

        open STDERR, '>', '/dev/null';
        open STDOUT, '>', '/dev/null';

        # Do some other job cleanup/notification stuff that may
        # output to STDERR/STDOUT but I don't want in the
        # individual job log files
        do_some_other_stuff_before_going_to_next_job();
    }
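    A variation on the same idea is to dup the daemon's original handles once before the loop and restore them after each job, so the descriptors are never parked on /dev/null. A core-Perl sketch of one iteration, using File::Temp to stand in for a job's run path:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Save the daemon's original STDERR once (a dup, so fd 2 stays open).
open(my $saved_err, '>&', \*STDERR) or die "dup STDERR: $!";

my $run_path = tempdir(CLEANUP => 1);   # stand-in for $job->get_run_path()

# Per job: point STDERR at the job's log file...
open(STDERR, '>', "$run_path/stderr.txt") or die "redirect: $!";
print STDERR "job-specific error output\n";

# ...then restore the original STDERR instead of closing anything.
open(STDERR, '>&', $saved_err) or die "restore: $!";

open(my $in, '<', "$run_path/stderr.txt") or die;
my @logged = <$in>;
print "logged: @logged";
```

Because each reopen implicitly closes and reopens the same handle in one step, fileno(STDERR) keeps its value across iterations.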
    Thanks everyone for the help.

Node Type: perlquestion [id://688741]
Approved by Corion
Front-paged by andreas1234567