PerlMonks  

What is the best way to dump data structures to logfiles?

by monsieur_champs (Curate)
on Mar 13, 2007 at 17:51 UTC ( #604629=perlquestion )
monsieur_champs has asked for the wisdom of the Perl Monks concerning the following question:

Fellow monks
I want to provide operations at my current company with a detailed log of parameters seen and parsed, and of data read from or written to the database. This is for day-to-day operations, to support user problems and to figure out what happened to the system during specific events of interest.

Currently, the code uses Data::Dumper as its dumping tool for logging purposes. I see this as a bad thing and would like to remove it from the code: Data::Dumper is supposed to be used for debugging purposes, not for day-to-day operations, I think.

Of course, I don't know everything. I would like to hear from the fellow monks what they use to pretty-print complex, deep data structures from source code, and what they recommend as a good strategy to achieve good results.

Thank you all for commenting.

Re: What is the best way to dump data structures to logfiles?
by jettero (Monsignor) on Mar 13, 2007 at 18:01 UTC

    I think Data::Dumper is fine for everyday purposes. It's pretty fast. My guess is the object-oriented interface would make things more log-friendly. Also, set $Data::Dumper::Indent=0 to keep it all on one line.
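    To illustrate the single-line idea, here is a minimal sketch (the variable names and sample data are invented for the example; Sortkeys and Terse are optional extras that tend to help logs):

```perl
use strict;
use warnings;
use Data::Dumper;

# Keep each dump on one line so a log entry stays a single record.
$Data::Dumper::Indent   = 0;   # no newlines or indentation
$Data::Dumper::Sortkeys = 1;   # stable key order between runs
$Data::Dumper::Terse    = 1;   # drop the leading '$VAR1 = '

my %params = ( user => 'alice', ids => [ 3, 7 ] );
print 'PARAMS: ', Dumper( \%params ), "\n";
```

    In real logging code you would probably localize those package variables inside the logging sub rather than set them globally.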

    You might also look at JSON, which may have a way to do single-line things, and if that (staying on one line, I mean) isn't a concern, you might even look at YAML::Syck. It's supposed to be pretty fast, since it uses a C library for ... something.
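    JSON output is in fact single-line by default. A small sketch using the core JSON::PP module (in core since Perl 5.14; the sample data is made up, and canonical() just sorts keys so successive log lines diff cleanly):

```perl
use strict;
use warnings;
use JSON::PP;

# encode() emits one line per structure; canonical() gives stable key order.
my $coder  = JSON::PP->new->canonical;
my %record = ( user => 'alice', ids => [ 3, 7 ] );
print 'RECORD: ', $coder->encode( \%record ), "\n";
# prints: RECORD: {"ids":[3,7],"user":"alice"}
```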

    -Paul

      Dear Paul
      I don't think that changing notation will solve my problem. Having a pretty printer for generic Perl data structures should do the trick, and, as far as I'm concerned, things could get nasty when I try to explain to the operations fellows that they will see data in a format that is no longer Perl, yet isn't "pure data" either, but another programming language. That's possibly too much information for them.

      BTW, thanks for your interest. The tip about YAML::Syck looks interesting, and I will try to study the issue a bit more before saying anything.

      Thank you!

Re: What is the best way to dump data structures to logfiles?
by Anno (Deacon) on Mar 13, 2007 at 18:04 UTC
    If Data::Dumper does the job, then what's wrong with it? It isn't a debugging tool as such, it's a tool to turn Perl data structures into readable Perl code. Looks like just what you need.

    Anno

Re: What is the best way to dump data structures to logfiles?
by dragonchild (Archbishop) on Mar 13, 2007 at 18:12 UTC
    One option here is DBM::Deep. It's not human-readable, but it's quite useful for transparent instantiation of Perl datastructures to disk.

    My criteria for good software:
    1. Does it work?
    2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

      Dear dragonchild
      Sorry, maybe I didn't explain my problem clearly. I need some kind of data structure pretty printer, so I can send some complex (and often deep) Perl data structures to logfiles. There is no need to be able to fetch those structures back into the system; they're only needed as a reference for the operations blokes, who need to know what was wrong with the system in a given time interval.

      I really don't see how DBM::Deep can be useful for this purpose. Maybe you can explain this to me?

      Thanks a lot for your interest, anyway.

        It sounds like you need something human-readable. If that's the case, then DBM::Deep is -not- for you.

        My criteria for good software:
        1. Does it work?
        2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
Re: What is the best way to dump data structures to logfiles?
by shmem (Canon) on Mar 13, 2007 at 18:53 UTC
    This is for day-by-day operations to support user problems, and figure out what happened to the system during determined interesting events.
    For that purpose I would focus on readability, which is not something Data::Dumper (or Data::Dump::Streamer) provides. For support tasks you need reports, and IMHO there's nothing better yet for that than format, formline and write (you could use Text::Template or Template, but maybe you don't want their overhead).

    It might be tedious to build the perlform reports in the first place, but that pays off, since you get a better overview of your data. You can even enrich those reports with ANSI colours and browse them in a colour-capable terminal with e.g. less -R.
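    A rough sketch of the formline machinery mentioned above, without the full format/write setup (the picture line and sample rows are invented for the example; formline appends into the accumulator variable $^A):

```perl
use strict;
use warnings;

# A format "picture": left-justified 10-char name, right-justified 5-char count.
my $picture = "@<<<<<<<<< @>>>>\n";

# formline() fills the picture with values and appends to $^A.
for my $row ( [ 'alice', 3 ], [ 'bob', 12 ] ) {
    formline( $picture, @$row );
}
my $report = $^A;   # collect the accumulated report
$^A = '';           # reset the accumulator when done
print $report;
```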

    --shmem

    _($_=" "x(1<<5)."?\n".q·/)Oo.  G°\        /
                                  /\_¯/(q    /
    ----------------------------  \__(m.====·.(_("always off the crowd"))."·
    ");sub _{s./.($e="'Itrs `mnsgdq Gdbj O`qkdq")=~y/"-y/#-z/;$e.e && print}
Re: What is the best way to dump data structures to logfiles?
by mercutio_viz (Scribe) on Mar 13, 2007 at 19:58 UTC

    Don't overlook a simple possibility: Data::Dump. It might be a reasonable middle ground between Data::Dumper's sheer dumping abilities and the prettiness of a structured report. Note: Data::Dump is still a 'dump' so as pretty as those dumps might be, they are still just that.

    -MC

Re: What is the best way to dump data structures to logfiles?
by talexb (Canon) on Mar 13, 2007 at 20:24 UTC
      Currently, the code uses Data::Dumper as dumping tool, for logging purposes. I see this as a bad thing and would like to remove this from the code: Data::Dumper is supposed to be used for debugging purposes, not for day-by-day operations, I think.

    Why do you see this as a 'bad thing'?

    I have one service on a Production server logging at the DEBUG level (using Log::Log4perl). This creates huge log files, but it also gives me the ability to immediately diagnose problems on a busy server. It's not pretty, but it means I can do support quickly and efficiently; disk space is cheap, so why not?

    I have in the past used the debugger to step through my code (15 years ago I did the same thing with C code that I wrote); this lets me confirm that the code is doing *exactly* what I think it does. If there's a mismatch between intent and reality, at the very least I need to be aware of that fact. Better than being aware is to just fix the code.

    Alex / talexb / Toronto

    "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

      Dear talexb
      maybe "bad thing" is too strong to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep out of your production code.

      Of course, maybe good comments here from the fellow monks helped me make up my mind about this and see the module in a different way, now.

      Maybe I can confine it in the dungeons of my logging system and keep it well-fed with data to dump; then I will be happy, and the operations guys (who are mostly non-programmers) will also be able to tell something useful from the dumped data.

      Thank you very much for your considerations.

          maybe "bad thing" is too strong to express my feelings. I always saw Data::Dumper as a debugging tool, something you should keep out of your production code.

        My query was meant to be helpful, not accusatory :) -- I was just trying to challenge your thinking, and get you to back up or prove your case.

        Perhaps you could do full logging in a file that only you see, and abbreviated logging to a file that the operations guys see. Or use the same file, and tell Operations to ignore anything at an INFO or DEBUG level.

        But by all means dump lots of data if it will help you better understand exactly how your system is working. I'd guess that in six months you'll be confident enough that you'll be able to reduce the logging level. Another strategy would be to bump the logging level up under certain unusual conditions -- that way you log lots of data only in the cases you're really interested in. Or do the inverse -- leave the logging level high, but reduce it once you discover you're doing an ordinary transaction.
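        The level-threshold idea above can be sketched without any modules. The level names below mirror common logging frameworks such as Log::Log4perl, but this is a toy illustration of the filtering logic, not that module's API:

```perl
use strict;
use warnings;

# Rank the levels; anything below the threshold is silently dropped.
my %rank      = ( DEBUG => 0, INFO => 1, WARN => 2, ERROR => 3, FATAL => 4 );
my $threshold = 'INFO';   # e.g. operations ignore DEBUG noise

# Return the formatted line if it clears the threshold, else undef.
sub logmsg {
    my ( $level, $msg ) = @_;
    return if $rank{$level} < $rank{$threshold};
    return sprintf "%-5s %s\n", $level, $msg;
}

print logmsg( 'ERROR', 'unable to foo' ) // '';
```

        Bumping $threshold up or down at runtime gives the "log more only in interesting cases" behaviour described above.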

        Alex / talexb / Toronto

        "Groklaw is the open-source mentality applied to legal research" ~ Linus Torvalds

Re: What is the best way to dump data structures to logfiles?
by blahblahblah (Priest) on Mar 14, 2007 at 02:52 UTC
    A lot of good options have been mentioned. I think it's important to consider who the primary audience is. If you or another knowledgeable Perl programmer will be reading the logs, then I like Data::Dumper or some module that does a good job of showing exactly what the Perl data structures are. On the other hand, if you're aiming to make it as readable and simple as possible for a non-programmer, I might recommend something that doesn't preserve all the Perl details, like JSON.

    Joe

Re: What is the best way to dump data structures to logfiles?
by radiantmatrix (Parson) on Mar 14, 2007 at 20:13 UTC

    You really haven't given all your requirements here, so I apologize if this is a useless suggestion.

    I've had success using Data::Dumper or YAML (or its XS replacement YAML::Syck) to dump structures to a file, which is then referenced in the log file. I use a sub that looks something like:

    # assumes-> use File::Temp 'tempfile';  for tempfile()
    # assumes-> use YAML 'Dump';  (or) use YAML::Syck 'Dump';  for Dump()
    sub dump {
        my $message    = shift;
        my $structures = scalar @_;   # yes, 'scalar' is redundant, but readable

        # create a file in the current directory with prefix 'dump.'
        # (tempfile() returns both a handle and a name)
        my ( $dump_h, $dump_n ) = tempfile( 'dump.XXXXXX', DIR => '.' );
        print {$dump_h} Dump($_) for @_;   # note: no comma after the filehandle
        close $dump_h;

        $message .= sprintf ' (%d dumped to "%s")', $structures, $dump_n;
        return $message;
    }

    And would be called something like:

    $logger->fatal( dump('Unable to foo, dumping object $foo_doer.', $foo_doer) );

    The log line would look something like:

    20070314 15:09 FATAL: Unable to foo, dumping object $foo_doer. (1 dumped to "dump.foobar1")

    And the file "dump.foobar1" would contain the $foo_doer serialization.

    You can equally use Data::Dumper or any other serialization module you prefer using this model.

    <radiant.matrix>
    Ramblings and references
    The Code that can be seen is not the true Code
    I haven't found a problem yet that can't be solved by a well-placed trebuchet

Node Type: perlquestion [id://604629]
Approved by Corion