If there was a trap, it was the trap of not reading the question correctly and zeroing in on the limit mentioned when I shouldn't have. I don't want to perpetuate any myths, which is why I'm glad you're correcting me.
And for sounding authoritative: I'm merely older than many of my fellow Perl monks, but definitely not more experienced in the use of Perl.

So:

"Hashes are OK for any number of keys until you run out of memory."

Even if this weren't the case, is there a better way to do it (without resorting to something like Judy)?

However, I think you're wrong with respect to point 3. On the one hand you say that the number of keys per bucket stays in the same range; but then how can there be a "worst case" scenario? If Perl were always able to keep the number of keys per bucket roughly the same, there could be no worst case.

Liz

In reply to Re: Re: Re: Re: Slowness when inserting into pre-extended array
by liz
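As a hedged aside on the bucket question above: on reasonably recent Perls, the core Hash::Util module exposes the bucket layout directly, so the "keys per bucket" claim can be checked empirically. (The functions below are not available on very old Perls; there, `scalar(%h)` used to report a "used/total" bucket string instead.)

```perl
use strict;
use warnings;
# bucket_ratio/used_buckets/num_buckets require a reasonably recent Perl
use Hash::Util qw(bucket_ratio used_buckets num_buckets);

# Build a hash with 1000 keys and look at how they spread over buckets.
my %h = map { $_ => 1 } 1 .. 1000;

print "used/total buckets: ", bucket_ratio(%h), "\n";
printf "average keys per used bucket: %.2f\n", 1000 / used_buckets(%h);
```

With typical keys the average stays small because Perl doubles the bucket count as the hash grows; the "worst case" the thread debates arises only when many keys happen to hash into the same bucket.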