PerlMonks
Re: What is the best way to dump data structures to logfiles?

by talexb (Canon)
on Mar 13, 2007 at 20:24 UTC


in reply to What is the best way to dump data structures to logfiles?

    Currently, the code uses Data::Dumper as a dumping tool, for logging purposes. I see this as a bad thing and would like to remove this from the code: Data::Dumper is supposed to be used for debugging purposes, not for day-to-day operations, I think.

Why do you see this as a 'bad thing'?

I have one service on a Production server logging at the DEBUG level (using Log::Log4perl). This creates huge log files, but it also gives me the ability to immediately diagnose problems on a busy server. It's not pretty, but it means I can do support quickly and efficiently; disk space is cheap, so why not?
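For reference, a minimal Log::Log4perl configuration that logs everything at the DEBUG level to a file might look like the sketch below (the category name and log path are hypothetical):

```
log4perl.logger.MyApp                              = DEBUG, Logfile
log4perl.appender.Logfile                          = Log::Log4perl::Appender::File
log4perl.appender.Logfile.filename                 = /var/log/myapp/debug.log
log4perl.appender.Logfile.layout                   = Log::Log4perl::Layout::PatternLayout
log4perl.appender.Logfile.layout.ConversionPattern = %d %p %c - %m%n
```

With a pattern layout like this, every line carries a timestamp, level, and category, which is what makes the huge files greppable when you need to diagnose a problem in a hurry.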

I have in the past used the debugger to step through my code (15 years ago I did the same thing with C code that I wrote); this lets me confirm that the code is doing *exactly* what I think it does. If there's a mismatch between intent and reality, at the very least I need to be aware of that fact. Better than being aware is to just fix the code.

Alex / talexb / Toronto

"Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds


Re^2: What is the best way to dump data structures to logfiles?
by monsieur_champs (Curate) on Mar 14, 2007 at 10:07 UTC

    Dear talexb
    maybe 'bad thing' is too strong a way to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep away from your production code.

    Of course, the good comments here from fellow monks helped me make up my mind about this, and I now see the module in a different way.

    Maybe I can confine it to the dungeons of my logging system and keep it well-fed with data to dump; then I will be happy, and the operations guys (who are mostly non-programmers) will also be able to tell something useful from the dumped data.
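One way to make those dumps friendlier to non-programmers is to tune Data::Dumper before handing its output to the logger. A minimal sketch, using only core Data::Dumper settings (the data structure here is invented for illustration):

```perl
use strict;
use warnings;
use Data::Dumper;

# Compact, deterministic dumps suit one-per-line log entries better
# than Data::Dumper's default multi-line output.
local $Data::Dumper::Indent   = 0;  # everything on one line
local $Data::Dumper::Terse    = 1;  # drop the leading '$VAR1 = '
local $Data::Dumper::Sortkeys = 1;  # stable key order across runs

my $request = { user => 'bob', id => 42 };
my $dump = Dumper($request);
print "$dump\n";
```

Sorted keys also make it possible to diff dumps from two runs, which is often exactly what the operations team wants to do.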

    Thank you very much for your considerations.

        maybe 'bad thing' is too strong a way to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep away from your production code.

      My query was meant to be helpful, not accusatory :) -- I was just trying to challenge your thinking, and get you to back up or prove your case.

      Perhaps you could do full logging in a file that only you see, and abbreviated logging to a file that the operations guys see. Or use the same file, and tell Operations to ignore anything at an INFO or DEBUG level.
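That split can be done in Log::Log4perl with two appenders on the same logger and a Threshold on the one Operations reads, so one file gets everything and the other only WARN and above -- a sketch, with hypothetical category and file names:

```
log4perl.logger.MyApp            = DEBUG, Full, Ops
log4perl.appender.Full           = Log::Log4perl::Appender::File
log4perl.appender.Full.filename  = /var/log/myapp/full.log
log4perl.appender.Full.layout    = Log::Log4perl::Layout::SimpleLayout
log4perl.appender.Ops            = Log::Log4perl::Appender::File
log4perl.appender.Ops.filename   = /var/log/myapp/ops.log
log4perl.appender.Ops.layout     = Log::Log4perl::Layout::SimpleLayout
log4perl.appender.Ops.Threshold  = WARN
```

The Threshold setting filters at the appender, so you can later dial the logger's level up or down without touching the Operations file at all.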

      But by all means dump lots of data if it will help you better understand exactly how your system is working. I'd guess that in six months you'll be confident enough that you'll be able to reduce the logging level. Another strategy would be to bump the logging level up under certain unusual conditions -- that way you log lots of data only in the cases you're really interested in. Or do the inverse -- leave the logging level high, but reduce it once you discover you're doing an ordinary transaction.

      Alex / talexb / Toronto

      "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds
