It is, at best, very *rude* to say “total garbage” in response to a post ... it doesn’t make __you__ look like a genius, merely a boor. (I searched the entire thread for the word “best” and did not find it. Perhaps your eyes are better than mine.)
That being said, __if__ the requirement *is* for “a __matching__ entry,” then the technique that I described will work extremely well. Strictly speaking, yes, it is possible for (say) a hash-collision to occur, but with a strong algorithm like SHA1 it basically isn’t going to happen.
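For what it’s worth, here is a minimal sketch of that exact-match idea in Python. The names (`build_index`, `find_exact`) and the assumption that each vector is already packed into `bytes` are mine, not anything from the thread:

```python
import hashlib

def sha1_key(packed: bytes) -> str:
    """SHA1 digest of one packed bit-vector."""
    return hashlib.sha1(packed).hexdigest()

def build_index(entries):
    """Map SHA1 digest -> list of stored vectors sharing that digest."""
    index = {}
    for entry in entries:
        index.setdefault(sha1_key(entry), []).append(entry)
    return index

def find_exact(index, query: bytes):
    # Digest equality is, for all practical purposes, content equality
    # with SHA1, but the final comparison rules out even the freak collision.
    return [e for e in index.get(sha1_key(query), []) if e == query]
```

Each lookup is then a single dictionary probe, no matter how many vectors are stored.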
*Any* exploitable “predictable fact” about the actual data can be profitably used to reduce the search-space, and (IMHO) in a situation such as this, pragmatically must be. Even something as basic as “the total number of 1’s” can be pressed into service. If you can “reasonably predict” that the differences between the searched-for array and the “best fit” that will be found will consist, let’s say, of a change-of-state of no more than (some) *n* positions, then even a brute-force search could be limited to consider only the candidates which fall into that range, perhaps starting with any exact-matches and then working outward ± *x* for *x* in *(1..n)*. This does, of course, open up the possibility of a statistical Type-2 Error (concluding that no best-match exists when in fact one does), but this might be judged to be either acceptable or necessary. (Or not ...)
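To make that concrete, a sketch of the ones-count window (again, the names are illustrative assumptions of mine): if at most *n* bits differ, the popcounts can differ by at most *n*, so nothing outside the window *q−n .. q+n* ever needs to be looked at.

```python
def popcount(packed: bytes) -> int:
    """Total number of 1-bits in a packed vector."""
    return sum(bin(b).count("1") for b in packed)

def bucket_by_popcount(entries):
    """Group stored vectors by their ones-count."""
    buckets = {}
    for entry in entries:
        buckets.setdefault(popcount(entry), []).append(entry)
    return buckets

def candidates_within(buckets, query: bytes, n: int):
    """Yield candidates in order of popcount distance 0, 1, ..., n."""
    q = popcount(query)
    for x in range(n + 1):
        for count in {q - x, q + x}:  # a set, so x == 0 yields q only once
            yield from buckets.get(count, [])
```

Working outward like this means the exact-popcount candidates are tried first, which is exactly the “start with exact matches, then fan out” ordering described above.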
If necessity really *must* become the mother of invention, and once again if you know that there are exploitable characteristics of the data, it is also possible to apply hashing or ones-counting to *slices* of the total vector. Instead of merely counting all the 1’s, count them in every (say) thousand bits. Apply some useful heuristic to this vector of sums to decide whether the whole vector is worth examining at all.
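A sketch of that per-slice variant (the 1000-bit slice size is the one mentioned above; the particular pruning rule is an assumption of mine):

```python
SLICE_BITS = 1000

def slice_counts(bits, size=SLICE_BITS):
    """bits is a sequence of 0/1 ints; returns the ones-count of each slice."""
    return tuple(sum(bits[i:i + size]) for i in range(0, len(bits), size))

def plausible(query_sig, cand_sig, n):
    # Flipping one bit changes exactly one slice count by exactly 1, so the
    # summed per-slice differences are a lower bound on the Hamming distance.
    return sum(abs(a - b) for a, b in zip(query_sig, cand_sig)) <= n
```

Because the per-slice comparison is a true lower bound on the number of differing bits, it can only discard candidates that genuinely cannot qualify; the full comparison is reserved for the survivors.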
In the end, the problem won’t be completely abstract, and neither will its solution. There must be *something,* in the real world, that stipulates what is and what is not “best,” or even “plausible.” It’s my opinion that you __must__ solve this problem, at least in significant part, by reducing the total number of vectors that you *consider,* and by selecting for consideration only those which are “most likely.” Representational optimizations such as the use of bit-vectors may also be important, but even these can’t be applied in a brute-force way. You’ll simply never get the work done.