Re: Efficient giant hashes
by dragonchild (Archbishop) on Mar 10, 2005 at 13:28 UTC ( [id://438236] )
"The script becomes notably slower when reaching this size."
A hash of 100_000 elements is going to cost, roughly, 1-2K per element. That's 100M-200M of RAM. Unless each element is some huge data structure in its own right (like an array of hashes or somesuch), you're probably not doing a lot of swapping to disk or anything like that.

It sounds like your algorithm(s) aren't scaling with your data structures. For example, iterating with a foreach over keys %hash is going to be considerably slower than the equivalent each loop (see the sketch below): foreach has to build the full list of keys in memory, then iterate over it, while each will only bring in one key/value pair at a time.

My bet is on your algorithms, not your data structures. Maybe if you posted a few snippets of how you use this massive hash, we might be able to help you out.

Being right does not endow the right to be rude; politeness costs nothing.
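The inline code from the original post did not survive extraction; a minimal reconstruction of the comparison being made, assuming a hypothetical %hash and a placeholder process() routine, would look like this:

    # foreach: keys %hash builds the complete list of keys in memory
    # before the loop body ever runs -- a large temporary list for a
    # 100_000-element hash.
    foreach my $key ( keys %hash ) {
        process( $key, $hash{$key} );
    }

    # each: walks the hash with its internal iterator, returning one
    # key/value pair per call, so no big temporary list is built.
    while ( my ( $key, $value ) = each %hash ) {
        process( $key, $value );
    }

One caveat with each: if you exit the loop early, the hash's internal iterator is left partway through, so call keys %hash (in void context) to reset it before iterating again.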
In Section: Seekers of Perl Wisdom