in reply to Store large hashes more efficiently
With 10 million entries, sheer size is getting in your way, and may even cause swapping (I have no idea about your process size limits). If that happens, speed will be your first problem.
As you didn't mention other resource limits or process requirements, I'd just like to note that I created Tie::Hash::DBD to "fix" a similar problem. In my case, my hash ran into a few 100_000 entries, and tying the hash with DB_File was not a solution, as it could not cope. As I was using a database anyway, I thought I might as well use it for the hash too. The hash got a lot slower in the beginning, but the overall process time was halved, and with the option to "keep" the hash in the database, subsequent runs gained a lot.
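To give an idea of what that looks like: a minimal sketch following the Tie::Hash::DBD synopsis, here with SQLite as the example backend (the DSN and database name are just placeholders; any DBD your DBI installation supports should work):

```perl
use strict;
use warnings;
use Tie::Hash::DBD;    # needs DBI plus a DBD driver, e.g. DBD::SQLite

# Tie the hash to a database table instead of keeping it in RAM.
# "sandbox.tie" is a placeholder file name for this sketch.
tie my %hash, "Tie::Hash::DBD", "dbi:SQLite:dbname=sandbox.tie";

$hash{key} = "value";       # each store goes to the database
print $hash{key}, "\n";     # each fetch comes back from it

untie %hash;                # the data stays in the database file
```

Because the data survives in the database after untie, a subsequent run can tie the same hash again and start from the filled state, which is where the real gain came from in my case.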
Given how you described your problem, Tie::Hash::DBD will probably not solve it directly, but it might be something to look at if it turns out to fit.
Enjoy, Have FUN! H.Merijn