Hi monks,
I have a question for you. It is more of a theoretical question, asking for a tip rather than actual code. The problem is that I have a bit-vector of size N:
What I need to figure out is the position of the overall minimum between two given indexes i,j. If this is the vector, with 1 marking a point of increase and 0 marking a point of decrease in some unknown value, then the vector can be converted into:
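To make the conversion concrete, here is a minimal Python sketch; the bit-vector below is made up, since the original example vector is not reproduced here. Each 1 contributes +1 and each 0 contributes -1, so the implicit value at each position is just a running prefix sum of those steps.

```python
# Hypothetical bit-vector (the original example is not shown in the post).
bits = [1, 1, 0, 1, 0, 0, 0, 1]

# Convert: 1 = +1 (increase), 0 = -1 (decrease).
# The implicit values are the running prefix sums of those steps.
values = []
level = 0
for b in bits:
    level += 1 if b else -1
    values.append(level)

print(values)  # -> [1, 2, 1, 2, 1, 0, -1, 0]
```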
Essentially, what I have here is a Cartesian tree, which means that only certain types of indexes can be picked:
So given two indexes, I would like to figure out the minimum value between them (which in my example is 8) and its position.
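For reference, the query itself is a range-minimum query over the derived value sequence; done naively it is just a linear scan. A small Python sketch, using made-up data rather than the post's example:

```python
def range_min(values, i, j):
    """Return (min_value, position) over values[i..j], inclusive."""
    best_pos = i
    for k in range(i + 1, j + 1):
        if values[k] < values[best_pos]:
            best_pos = k
    return values[best_pos], best_pos

vals = [3, 1, 4, 1, 5, 9, 2, 6]   # made-up data, not the post's vector
print(range_min(vals, 2, 6))      # -> (1, 3)
```

This is O(j - i) per query, which is exactly what the precomputed structures below are meant to avoid.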
Current solutions all deal with this problem by precomputing a sparse table and then doing constant-time lookups. That is really fast, but in my case I cannot create an array of type B, nor a classical sparse table, since I am bound to dealing only with bits; all I have and can work with are arrays of type A (bit-vectors).

So my current solution is to divide A into chunks, evaluate whether the overall minimum of each chunk is greater or smaller than the overall minimum of the previous chunk, build another bit-vector reflecting the relative growth or decay, and repeat this in a tree fashion until I reach the top-level minimum (so what I end up with is a binary tree). The problem with this approach is that it runs rather slowly, since I cannot do constant-time lookups like with a sparse table, and the construction complexity grows from O(n) to O(n log n). Since I am running this on big datasets, the memory requirements for one such structure (on a 64-bit OS) grow up to 300 GB of RAM (I tested it the other day on the university cluster :))
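For comparison, the sparse-table approach mentioned above can be sketched as follows. This is a standard O(n log n)-space / O(1)-query RMQ over an explicit value array (type B), i.e. exactly the structure that doesn't fit the bit-only constraint; the data is made up.

```python
def build_sparse_table(a):
    """st[k][i] = index of the minimum of a[i .. i + 2**k - 1]."""
    n = len(a)
    st = [list(range(n))]          # level 0: each element is its own minimum
    k = 1
    while (1 << k) <= n:
        prev, row = st[k - 1], []
        for i in range(n - (1 << k) + 1):
            # Combine two overlapping half-windows of length 2**(k-1).
            l, r = prev[i], prev[i + (1 << (k - 1))]
            row.append(l if a[l] <= a[r] else r)
        st.append(row)
        k += 1
    return st

def query(st, a, i, j):
    """Index of the minimum of a[i..j], answered in O(1)."""
    k = (j - i + 1).bit_length() - 1
    l, r = st[k][i], st[k][j - (1 << k) + 1]
    return l if a[l] <= a[r] else r

a = [5, 2, 4, 7, 1, 3, 6, 0]      # made-up data
st = build_sparse_table(a)
print(query(st, a, 1, 5))          # -> 4 (a[4] == 1 is the minimum of a[1..5])
```

The memory blow-up in the post comes precisely from the n log n table entries, each a full word, instead of bits.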
So my question is: does anyone have an idea how to do this type of search without building an actual tree on top of the initial bit-vector (A), as I am doing now?