in reply to Re^2: how to merge many files of sorted hashes?
in thread how to merge many files of sorted hashes?

So let's see if I'm reading this right (you don't say how many distinct keys there are, i.e., how finely the binning is grouping things):
- you have 100 points
- for every possible triple (100^3 = 1,000,000 ordered combinations) you're computing a matrix
- each matrix gets pushed 100 times (onto 100 different keys)
- you start a new file every 1000 matrices (so there'll be 1000 files when you're done, and each file will hold 100,000 matrix entries)
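Here's a scaled-down sketch of that setup as I read it, just to check my understanding -- the names, sizes, and the stand-in "matrix" string are all mine, not your code:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

my @points = (0 .. 9);          # you have 100; 10 keeps the demo fast
my $chunk  = 100;               # you roll a new file every 1000 matrices
my $dir    = tempdir(CLEANUP => 1);

my %small_hash;
my ($count, $file_no) = (0, 0);
for my $i (@points) {
    for my $j (@points) {
        for my $k (@points) {
            my $matrix = "$i-$j-$k";   # stand-in for the real computation
            # same matrix pushed once per key (here: one key per point)
            push @{ $small_hash{$_} }, $matrix for @points;
            next if ++$count % $chunk;
            # flush the accumulated hash to the next chunk file
            open my $fh, '>', sprintf("%s/chunk_%d.dat", $dir, $file_no++)
                or die "open: $!";
            print {$fh} "$_\t@{ $small_hash{$_} }\n" for keys %small_hash;
            close $fh;
            %small_hash = ();
        }
    }
}
# 10^3 = 1000 triples at 100 per file gives 10 chunk files
```

With your real numbers (100 points, 1000 per file) that's the 1000 files and 100,000 entries per file from the list above.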
But the exact numbers hardly matter: for every single key you are doing a linear search through every file, so that's (number of keys) * 100,000,000 matrix entries to wade through. No wonder this is slow.
You need to be writing each %small_hash in sorted key order. Sorting 1000 small hashes will be faster than sorting the big hash
(1000 * (n/1000) * log(n/1000) = n log(n/1000) < n log(n)),
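Writing a hash in sorted order is cheap to bolt on. A minimal sketch, assuming a flat "key<TAB>value" record per line (write_sorted() and that format are my inventions -- substitute however you serialize your matrices):

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Dump one %small_hash with its keys in sorted order, one record per line.
sub write_sorted {
    my ($hash, $path) = @_;
    open my $fh, '>', $path or die "open $path: $!";
    for my $key (sort keys %$hash) {
        print {$fh} "$key\t$_\n" for @{ $hash->{$key} };
    }
    close $fh or die "close $path: $!";
}

my $dir = tempdir(CLEANUP => 1);
my %small_hash = (b => [2, 20], a => [1], c => [3]);
write_sorted(\%small_hash, "$dir/chunk_0.dat");
```

The file comes out with all of a's records, then b's, then c's, which is exactly what the merge needs.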
but the really important thing is that sorted files enable the merge I was talking about before, which reads each file only once rather than once for every key value.
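Here's a sketch of that merge (helper names and the "key<TAB>value" line format are mine): open every sorted chunk once, repeatedly take the stream whose current key sorts first, and emit records grouped by key. Each file gets a single sequential pass.

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);

sub merge_sorted_files {
    my @streams;
    for my $path (@_) {
        open my $fh, '<', $path or die "open $path: $!";
        my $line = <$fh>;
        push @streams, { fh => $fh, line => $line } if defined $line;
    }
    my @merged;
    while (@streams) {
        # pick the stream whose current key is smallest; with many files
        # a heap beats re-sorting every iteration, but this shows the idea
        my ($best) = sort {
            (split /\t/, $a->{line})[0] cmp (split /\t/, $b->{line})[0]
        } @streams;
        push @merged, $best->{line};
        my $next = readline $best->{fh};
        if (defined $next) { $best->{line} = $next }
        else               { @streams = grep { $_ != $best } @streams }
    }
    return @merged;
}

# two already-sorted chunk files to demonstrate on
my $dir = tempdir(CLEANUP => 1);
open my $a_fh, '>', "$dir/a.dat" or die $!;
print {$a_fh} "a\t1\nc\t3\n";
close $a_fh;
open my $b_fh, '>', "$dir/b.dat" or die $!;
print {$b_fh} "b\t2\nc\t30\n";
close $b_fh;
my @out = merge_sorted_files("$dir/a.dat", "$dir/b.dat");
```

The two c records, one from each file, come out adjacent without anyone scanning either file twice -- that's the whole win.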