
Re^4: Saving and Loading of Variables

by madbombX (Hermit)
on Jul 19, 2006 at 16:28 UTC ( #562346=note )

in reply to Re^3: Saving and Loading of Variables
in thread Saving and Loading of Variables

You're correct about the data loss. I'm OK with a slight loss (roughly 100 messages or so), since my plan is to write the information out to a file every 100 messages.
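That checkpoint-every-100-messages plan might look like the following sketch. The counter names and the output filename are purely illustrative, not from any existing code:

```perl
use strict;
use warnings;
use Storable qw(nstore);

my %totals;         # running combination counts (illustrative)
my $msg_count = 0;

# Record the rule combinations that hit one message, and
# checkpoint to disk every 100 messages; a crash then loses
# at most the last unsaved batch.
sub record_message {
    my (@combos) = @_;
    $totals{$_}++ for @combos;
    $msg_count++;
    nstore( \%totals, 'totals.stor' ) if $msg_count % 100 == 0;
}
```

Because the write happens only on every 100th message, the I/O cost is amortized while the worst-case loss stays bounded at one batch.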

It's not so much that multiple runs are made, but that the message tests will be iterated a few times in order to get the combination values needed. That is, aside from the BAYES_XX+TEST_X combination (which will be counted every time), I will specify which test(s) I want to catch in combination, which will require multiple iterations over a message. For instance, if I find that URIBL_SBL hits frequently in SPAM messages, then I will want to see what else also hits frequently in SPAM messages, and a URIBL_SBL+TEST_Y category will be created and "Totaled". The "Value" will always be the same; it is there more for reference purposes than anything else.
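One way to picture that counting scheme is a hash keyed on "BASE+OTHER" pairs, bumped once per co-occurrence. This is only a sketch of the idea; the function and variable names are hypothetical:

```perl
use strict;
use warnings;

my %combo_total;    # "BASE+OTHER" => running count

# Given a base test (e.g. URIBL_SBL) and the full list of rules
# that hit one message, bump the counter for each base+other pair.
# If the base rule itself didn't hit, the message contributes nothing.
sub tally_combos {
    my ( $base, @hits ) = @_;
    return unless grep { $_ eq $base } @hits;
    $combo_total{"$base+$_"}++ for grep { $_ ne $base } @hits;
}
```

For example, `tally_combos( 'URIBL_SBL', qw(URIBL_SBL BAYES_99 RAZOR2_CHECK) )` would increment both `URIBL_SBL+BAYES_99` and `URIBL_SBL+RAZOR2_CHECK`.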

This all brings me back to my question: what is the best way to store this information on disk? Data::Dumper, Storable, and FreezeThaw (among others) have been suggested, but I am curious which is the most efficient for large amounts of data, considering that it isn't just hashes and arrays but semi-complex data structures as well.
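For nested structures like these, a Storable round-trip is a reasonable baseline to benchmark the others against, since it serializes to a compact binary format rather than Perl source text. A minimal sketch, with an invented structure shaped like the one described above (hypothetical keys and filename):

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);

# Per-combination running total plus a fixed reference value,
# mirroring the "Totaled"/"Value" fields described in the thread.
my %results = (
    'URIBL_SBL+BAYES_99' => { Totaled => 42, Value => 3.5 },
);

# nstore() writes in network byte order, so the file stays portable
# across machines; retrieve() rebuilds the full nested structure.
nstore( \%results, 'results.stor' );
my $loaded = retrieve('results.stor');
```

Data::Dumper would give a human-readable (and eval-able) file instead, which is handy for debugging but slower and bulkier for large datasets.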
