I hope this isn't a stupid question. At least I've discovered a workaround for my problem, and I'm happy to share the story in the event that it helps someone else.

I've developed a pile of scripts that use Log::Log4perl, a module that I love to bits. Wanting to implement a system that would E-Mail me when there was a problem, I decided to add an E-Mail appender to my configuration. It worked fine in my test scripts, but didn't work in my work scripts, and I couldn't understand why.
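The appender half of my configuration is along these lines (a minimal sketch rather than my real config; the appender name 'Mailer', the appender class, and the address are stand-ins):

    log4perl.rootLogger                = INFO, Mailer
    log4perl.appender.Mailer           = Log::Dispatch::Email::MailSend
    log4perl.appender.Mailer.to        = me@example.com
    log4perl.appender.Mailer.subject   = Problem in my script
    log4perl.appender.Mailer.buffered  = 1
    log4perl.appender.Mailer.Threshold = ERROR
    log4perl.appender.Mailer.layout    = Log::Log4perl::Layout::SimpleLayout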

I finally figured it out this afternoon (which explains why I'm pausing to sip from a tall can of Waterloo Amber, from "Ontario's First Craft Brewery"). My test script looks like this:

    use Log::Log4perl;

    {
        Log::Log4perl->init( $log4perl_config );
        my $log = Log::Log4perl->get_logger();
        ..
        $log->info('Some stuff');
    }
This works fine. My work scripts are bigger and have subroutines in which I want to do logging, so they look like this:
    use Log::Log4perl;

    my $log;

    {
        Log::Log4perl->init( $log4perl_config );
        $log = Log::Log4perl->get_logger();
        ..
        $log->info('Some stuff');
        do_something();
    }

    sub do_something {
        $log->info('Some other stuff');
        ..
    }

See the difference? In the work script I'm using what I call a file-scoped variable so that I don't have to pass the logger into the subroutine (something MSCHILLI goes 'Brrrr!' to here). I believe Log::Log4perl buffers the log entries destined for E-Mail until the logger goes out of scope, and only then sends the message.
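My understanding of the buffering comes from the Log::Dispatch::Email documentation rather than from the Log4perl source, so take this sketch with a grain of salt (the address is again a stand-in):

    use Log::Dispatch::Email::MailSend;

    # With buffered => 1 (the default), each log() call just appends to
    # an internal buffer; the actual mail goes out when flush() runs,
    # which the class also calls from its DESTROY.
    my $email = Log::Dispatch::Email::MailSend->new(
        name      => 'mailer',
        min_level => 'error',
        to        => 'me@example.com',       # stand-in address
        subject   => 'Problem in my script',
        buffered  => 1,
    );

    $email->log( level => 'error', message => "Something broke\n" );  # buffered
    $email->flush;  # or let DESTROY send the batch when the object dies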

So the test script works fine. I assumed the work script's logger would likewise go out of scope when the script finished, and the E-Mail would be sent -- but no E-Mail ever arrives. Why not?

Anyway, my workaround is going to be to call get_logger in whichever scope I need to do logging (sketched below). This works fine, but I'm left wondering why Log::Log4perl doesn't notice that the logger in the work script has gone out of scope and send the E-Mail. Is there a 'finalize' or 'terminate' method I could have called to trigger this? Is this an error in the behaviour? (I find that hard to believe, for a module that's been around since 2002.)
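Here's what the workaround looks like, together with the explicit flush I'm tempted to try. This is only a sketch: 'Mailer' matches the stand-in appender name above, and I haven't confirmed that appender_by_name plus flush is the sanctioned way to do this.

    use Log::Log4perl;

    sub do_something {
        # Workaround: get_logger() returns the same singleton every
        # time, so just fetch it wherever logging is needed instead
        # of sharing a file-scoped $log.
        my $log = Log::Log4perl->get_logger();
        $log->info('Some other stuff');
    }

    END {
        # Speculative: look the appender up by name and flush its
        # buffer explicitly, rather than relying on object destruction
        # order at program exit.
        my $mailer = Log::Log4perl->appender_by_name('Mailer');
        $mailer->flush if $mailer && $mailer->can('flush');
    }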

Alex / talexb / Toronto

Thanks PJ. We owe you so much. Groklaw -- RIP -- 2003 to 2013.


In reply to Possible scoping issue with Log::Log4perl logger by talexb
