PerlMonks
Unfortunately, Tie::File is not a solution to this problem. For a file of 100,000 lines it completes, but takes around 800 times longer than List::Util::shuffle on an in-memory array. So what, trade time for space? The problem is that the increase in time is exponential, meaning that shuffling the OP's 27,000,000-record file would take close to a week!

However, more fundamental than that, Tie::File simply isn't able to cope with a file this large. It runs out of memory and segfaults trying this with a 1,000,000-record file on my machine, whereas I can comfortably load a 2,000,000-record file completely into memory and shuffle it, even using the copy semantics of List::Util::shuffle. This is true regardless of which settings (default, bigger, or smaller) I tried for the memory parameter.

Examine what is said, not who speaks.
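For reference, here is a minimal sketch of the two approaches being compared. The file name is a placeholder and this is not the code that was actually benchmarked; the Tie::File variant uses an in-place Fisher-Yates shuffle since List::Util::shuffle on a tied array would copy everything anyway.

```perl
use strict;
use warnings;
use Tie::File;
use List::Util qw(shuffle);

# Approach 1: in-memory. Slurp all lines, shuffle a copy, write back.
# Fast, but needs enough RAM to hold the whole file (twice over, given
# List::Util::shuffle's copy semantics).
sub shuffle_in_memory {
    my ($path) = @_;
    open my $in, '<', $path or die "open $path: $!";
    my @lines = <$in>;
    close $in;
    open my $out, '>', $path or die "open $path: $!";
    print $out shuffle(@lines);
    close $out;
}

# Approach 2: Tie::File. Treat the file as an array and do an in-place
# Fisher-Yates shuffle. Nominally low memory, but every swap is file
# I/O, which is where the huge slowdown comes from.
sub shuffle_tied {
    my ($path) = @_;
    tie my @lines, 'Tie::File', $path or die "tie $path: $!";
    for ( my $i = $#lines; $i > 0; $i-- ) {
        my $j = int rand( $i + 1 );
        @lines[ $i, $j ] = @lines[ $j, $i ];
    }
    untie @lines;
}
```

Both routines leave the file with the same set of records in a random order; only the time/memory trade-off differs.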
"Efficiency is intelligent laziness." - David Dunham
"Think for yourself!" - Abigail

In reply to Re: Re: Re: Re: Randomize lines with limited memory by BrowserUk