PerlMonks 
Consider an arbitrary number of lists of arbitrary size:
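The example lists were lost when this page was extracted; based on the bag given in the next paragraph, they would have looked something like this (my reconstruction, not the original snippet):

```perl
use strict;
use warnings;

# Reconstructed example: four lists of arbitrary size, counted into a
# single "bag" of symbol => count pairs.
my @lists = ( [qw(A A A A)], [qw(B B)], [qw(C C C)], [qw(D)] );

my %bag;
$bag{$_}++ for map { @$_ } @lists;
# %bag is now ( A => 4, B => 2, C => 3, D => 1 )
```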
As a "bag" this could be expressed as { A => 4, B => 2, C => 3, D => 1 }. The total number of elements across all lists (or all bag elements) is ten, though in general the result size depends on the inputs. Zip (or distribute) all bags into a single list in which each bag's elements are as uniformly distributed as possible. Many inputs will have more than one solution, but any single "as optimal as possible" solution is adequate. The above bags could distribute like this:
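The sample ordering from the original post didn't survive extraction, but as an illustration (this is my sketch, not the post's code) a classic weighted-round-robin / largest-remainder pass produces an acceptable ordering:

```perl
use strict;
use warnings;

# Sketch of one possible greedy approach: each symbol has an ideal
# spacing of total/count; always emit the symbol whose next ideal slot
# is earliest, breaking ties in favor of the larger bag.
sub distribute {
    my %bag   = @_;
    my $total = 0;
    $total += $_ for values %bag;

    my %left = %bag;
    my %next = map { $_ => $total / $bag{$_} / 2 } keys %bag;

    my @out;
    while ( @out < $total ) {
        my ($pick) = sort {
            $next{$a} <=> $next{$b}
                or $bag{$b} <=> $bag{$a}
                or $a cmp $b
        } grep { $left{$_} } keys %bag;
        push @out, $pick;
        $left{$pick}--;
        $next{$pick} += $total / $bag{$pick};
    }
    return @out;
}

print join( ' ', distribute( A => 4, B => 2, C => 3, D => 1 ) ), "\n";
```

For the example bag this yields an ordering in which each symbol lands close to its ideal spacing (A every 2.5 slots, C every 3.33, and so on); exact tie-breaks between symbols whose ideal slots coincide can go either way.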
One solution I considered was to convert each list into an iterator that spits out its contents at a fixed frequency, separated by undef, and then to zip the iterators. Each iterator would need to know its priority (its relative list size) and use that priority to pad its starting point with undef. Something like this:
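The original snippet is missing; a minimal sketch of the idea might look like this (make_iter, the offset scheme, and the eager zip are my guesses at the intent, not davido's code):

```perl
use strict;
use warnings;

# Sketch only: each iterator emits its next element once every $period
# calls and undef (padding) otherwise; $offset staggers the start so
# higher-priority (larger) lists begin sooner.
sub make_iter {
    my ( $period, $offset, @items ) = @_;
    my $tick = 0;
    return sub {
        return undef unless @items;
        my $t = $tick++;
        return ( $t >= $offset && ( $t - $offset ) % $period == 0 )
            ? shift @items
            : undef;
    };
}

my %bag = ( A => 4, B => 2, C => 3, D => 1 );
my $total = 0;
$total += $_ for values %bag;

# Largest bag gets the smallest period (highest frequency) and the
# earliest starting offset, hence the sort by descending size.
my @iters;
my $offset = 0;
for my $sym ( sort { $bag{$b} <=> $bag{$a} } keys %bag ) {
    push @iters,
        make_iter( int( $total / $bag{$sym} ), $offset++, ($sym) x $bag{$sym} );
}

# "Zip" the iterators round-robin, discarding the undef padding.
my @out;
while ( @out < $total ) {
    for my $it (@iters) {
        my $v = $it->();
        push @out, $v if defined $v;
    }
}
print "@out\n";    # prints: A C A B D A C A C B
```

Discarding the padding eagerly means the order within each round simply follows the size-sorted iterator order, which is part of why the result differs from a hand-built sample while still being reasonably uniform.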
This isn't exactly the same as the sample output I sought, but I think it's an equally optimal output, so it should be fine. One problem, though, is that we would need to sort the bags by size so that the largest always picks first, and so on. Why am I hooked on the idea of an iterator generating each sublist at a frequency, interspersed with undef? Because it seems to me a more general approach, allowing for the possibility of infinite streams, which could be useful for task scheduling by priority. So what are my questions? Several:
Nothing here is mission-critical; I was recently reading a discussion and started thinking... we're just satisfying curiosity. Last night I spent some time reading the docs for List::Gen, which may actually be useful here since it facilitates lazy generation, but I couldn't quite get my head around a List::Gen approach.

Dave

In reply to Bag uniform distribution algorithms by davido

