If it were me, for data of that size I wouldn't be using a tied hash at all. Do you need a hash to pass in to someone else's API? (In which case you can't assume they won't pull all the data into RAM at once anyway.)
What's the scenario you need it for, and what format does your data come in? If your access is fairly sequential, there are plenty of storage/access options. If you need a lot of random access, I'd suggest importing the data into an SQLite database (a one-time process) and then querying that database from your analysis script. SQLite does memory-based caching etc. for you automatically, so it should be surprisingly fast (and free you from worrying about memory utilisation).
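To make the SQLite suggestion concrete, here's a minimal sketch of the one-time import followed by random-access lookups. It's in Python (stdlib `sqlite3`) for brevity; the same idea works in Perl via DBI/DBD::SQLite. The table and column names are made up for illustration, and a real import would use a file path rather than `:memory:`.

```python
import sqlite3

# Hypothetical one-time import of key/value records into SQLite.
# In practice use a file, e.g. sqlite3.connect("data.db"), so the
# import survives between runs of the analysis script.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (key TEXT PRIMARY KEY, value TEXT)")

# Bulk-load inside a single transaction; executemany keeps it fast.
rows = [("alpha", "1"), ("beta", "2"), ("gamma", "3")]
with conn:
    conn.executemany("INSERT INTO records VALUES (?, ?)", rows)

# Random access later: the primary-key index makes each lookup cheap,
# and SQLite's own page cache keeps hot pages in memory for you.
cur = conn.execute("SELECT value FROM records WHERE key = ?", ("beta",))
print(cur.fetchone()[0])  # → 2
```

The point is that you only pay the import cost once, and afterwards every lookup is an indexed query instead of a full scan or an in-RAM hash.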