
Re: perl hashes

by Marshall (Canon)
on Jul 15, 2009 at 07:03 UTC ( #780181=note )

in reply to perl hashes

I routinely use hashes of 50K or 100K keys, sometimes several at once! That is not an issue if you have enough memory, and Perl's hash implementation is very efficient performance-wise. If you know exactly what you are looking for, no other data structure is as efficient.
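To get a feel for that, here is a minimal sketch (with made-up keys, not anyone's real data) that builds a 50K-key hash and times lookups with the core Benchmark module:

```perl
use strict;
use warnings;
use Benchmark qw(timethis);

# Build a hash with 50_000 synthetic keys (hypothetical example data).
my %h;
$h{"key$_"} = $_ for 1 .. 50_000;

# Hash lookups are O(1) on average regardless of hash size,
# so a million lookups should still be quick.
timethis( 1_000_000, sub { my $v = $h{"key25000"} } );
```

On any reasonable machine the lookup loop finishes in well under a second, which is the point: the hash itself is rarely where the time goes.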

I suspect that your trouble will be the same as mine: where does the stuff that goes into the hash come from? In my case the data comes from text files, and the "care and feeding" of those text files just dwarfs any hash table initialization. Put another way, getting the data off the disk and ready to be inserted into the hash, and creating the text output files, is what takes the VAST majority of the MIPs -- split(), regexes and such. I/O is "expensive". Anyway, 50K keys is not what I would consider a huge hash. If your app gets slow, look at your I/O code and benchmark it; with some experimentation you can make significant performance gains there.
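A sketch of that kind of benchmarking, comparing the parsing work (split on a hypothetical whitespace-delimited record) against the bare hash insert, again using the core Benchmark module:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# A hypothetical input line, standing in for one record from a text file.
my $line = "alpha 42 some trailing fields here";
my %hash;

cmpthese( -2, {    # run each sub for about 2 CPU seconds
    parse_and_store => sub {
        my ( $key, $val ) = split ' ', $line;    # the per-line parsing work
        $hash{$key} = $val;
    },
    store_only => sub {
        $hash{alpha} = 42;                       # the hash insert alone
    },
} );
```

The relative numbers will vary by machine, but the gap between the two rows shows how much of the per-record cost is parsing rather than the hash -- and that is before you even count the disk reads.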

Anyway, I/O is going to be the performance problem, not the hash itself.
