PerlMonks
Re^4: Out of Memory when generating large matrix
by LanX (Saint) on Mar 06, 2018 at 10:38 UTC [id://1210395]
Sure, but the OP is talking about counting through 197 x 8000 records. That's at most 1.5 million hash entries when every entry is counted just once; IIRC that'll result in about 150 MB of RAM (at most, and that's a pathological case).

If the RAM weren't even sufficient for counting, presorting a giant file wouldn't help. (wc could help, but an output with 1.5 million columns should be avoided...)

I'd surely opt for a DB like SQLite. But I suppose only some thousands of the most frequent K-mers are of interest.

(I can imagine a solution with a hash of hashes for counting, where only the most relevant hashes are kept in memory while the others are swapped out, but this would extend the scope of this thread.)
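To make the memory estimate concrete, here's a minimal sketch of the plain hash counting discussed above. The sub name `count_kmers` and the toy sequence are my own illustration, not from the OP's data; for the real 197 x 8000 case you'd stream records from the file instead of holding them in memory.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Count all overlapping k-mers of a sequence in one hash.
# Each distinct k-mer costs one hash entry, which is where
# the ~150 MB worst-case estimate for 1.5 million entries comes from.
sub count_kmers {
    my ($seq, $k) = @_;
    my %count;
    $count{ substr($seq, $_, $k) }++ for 0 .. length($seq) - $k;
    return \%count;
}

my $counts = count_kmers('ACGTACGT', 3);

# Keep only the most frequent k-mers, as suggested above.
my @by_freq = sort { $counts->{$b} <=> $counts->{$a} } keys %$counts;
printf "%s => %d\n", $_, $counts->{$_} for @by_freq;
```

If the hash outgrows RAM, the same loop could periodically flush `%count` into an SQLite table (via DBD::SQLite) and reset it, which is one way to realize the "swap out the less relevant hashes" idea.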
Cheers Rolf
In Section
Seekers of Perl Wisdom