a few dozen kilobytes of metadata, and then 130 megabytes of binary junk that is completely opaque to any human being.
Hmm. Let's see. 130 MB = 17,039,360 values (assuming IEEE double precision, 8 bytes each). If that were formatted as ASCII at the same precision, you'd need ~20 bytes per value, i.e. ~325 MB to render it in a human-readable format. At one pair of values per line and 80 lines per page, that's ~106,000 pages, or ~200 reams -- roughly 500 kg (half a tonne) of cheapish printer paper. And that's before you wrap it all up in the verbosity of XML.
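The arithmetic above can be sanity-checked in a few lines of Perl (a sketch, not from the original post; the ~20 bytes/value figure assumes ~17 significant digits to round-trip a double, plus separators):

```perl
use strict;
use warnings;

# A binary IEEE-754 double is 8 bytes:
my $binary = length pack 'd', 0.1;                  # 8 bytes

# Round-tripping a double in ASCII needs ~17 significant digits:
my $ascii = length sprintf '%.17g', 0.1;            # ~19 chars, ~20 with separator

my $values = 130 * 1024**2 / $binary;               # 17,039,360 doubles in 130 MB
my $ascii_mb = $values * 20 / 1024**2;              # ~325 MB as text

printf "values: %d, ascii size: ~%d MB\n", $values, $ascii_mb;

my $pages = ($values / 2) / 80;                     # one pair per line, 80 lines/page
printf "pages: ~%d, reams: ~%d\n", $pages, $pages / 500;
```

Run it and the numbers in the paragraph above fall out directly: ~17 million values, ~325 MB of text, ~106,000 pages, ~213 reams.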
Now, how long do you think it would take a human being to peruse that lot and pick out the anomalous pairing? And what value is there in having those values in a human-readable format if no one is ever going to read them?
The point being that, to do anything meaningful with those volumes of data, you have to use software.
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
In reply to Re^6: I dislike object-oriented programming in general