I routinely use hashes of 50K or 100K keys, and sometimes several at once! This is not an issue if you have enough memory, and Perl's hash table is very efficient performance-wise. If you know the key you are looking for, no other data structure is as efficient at getting you the value.
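To give a feel for the scale, here is a minimal sketch (the key format is invented) that builds a 100K-key hash; the lookup at the end costs roughly the same no matter how many keys are in the hash:

```perl
use strict;
use warnings;

# Build a hash with 100K keys -- routine stuff, assuming you have the memory.
my %index;
$index{ sprintf "rec%06d", $_ } = $_ for 1 .. 100_000;

# Lookup is effectively constant time regardless of hash size.
print exists $index{"rec042000"} ? "found\n" : "missing\n";   # prints "found"
```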
I suspect that your trouble will be the same as mine: where does the stuff that goes into the hash come from? In my case the data comes from text files, and the "care and feeding" of those text files just dwarfs any hash table initialization. Getting the data off the disk and ready to be inserted into the hash, and creating the text output files, is what eats the VAST majority of the CPU cycles: split(), regexes and such. I/O is "expensive". Anyway, 50K keys is not what I would consider a huge hash. If your app gets slow, look at your I/O code and benchmark it; with some experimentation you can make significant performance gains there.
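You can see this split for yourself with the core Benchmark module. A rough sketch (the record format and field names are made up) comparing bare hash inserts against the split()-style parsing that usually feeds them:

```perl
use strict;
use warnings;
use Benchmark qw(timethese);

# Fake "text file" records -- in real life these would come off the disk,
# which adds actual I/O cost on top of the parsing measured here.
my @lines = map { "key$_," . ( $_ * 2 ) } 1 .. 50_000;

my %parsed;
timethese( 5, {
    hash_only => sub {
        my %h;
        $h{"key$_"} = $_ * 2 for 1 .. 50_000;      # pure hash insertion
    },
    parse_then_hash => sub {
        %parsed = ();
        for my $line (@lines) {
            my ( $k, $v ) = split /,/, $line;      # the parsing overhead
            $parsed{$k} = $v;
        }
    },
} );
```

On a typical run the parsing case is noticeably slower than the insert-only case, and that gap only grows once real file reads are in the picture.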
Either way, I/O is going to be the performance problem, not the hash itself.