PerlMonks
You could reduce the memory requirement to around a quarter by not using two levels of hash. A single level will do the job:
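A minimal sketch of that single-level approach (the tab-separated input format, the field names, and the NUL byte used as a key separator are all assumptions for illustration, not from the original post):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Instead of $count{$key1}{$key2}++, which allocates a separate
# sub-hash for every distinct first-level key, join the two keys
# into one composite key in a single flat hash.
my %count;

# Stand-in for reading the real (assumed tab-separated) data file.
my @lines = ( "a\tx", "a\tx", "b\ty" );

for my $line (@lines) {
    my ( $key1, $key2 ) = split /\t/, $line;
    ++$count{"$key1\x00$key2"};    # NUL-joined composite key
}

# The parts can be recovered by splitting the composite key apart.
for my $composite ( sort keys %count ) {
    my ( $key1, $key2 ) = split /\x00/, $composite;
    print "$key1 / $key2 : $count{$composite}\n";
}
```

NUL (`\x00`) is a convenient separator because it rarely appears in text data; any byte guaranteed absent from the keys would do.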
But that will still require around 4GB to build the 50e6-key hash. Better than 16GB, but you will still run out of memory if you are using a 32-bit Perl (unless you have a very high proportion of duplicates, e.g. >50%). As you say the data is presorted, investigate the uniq command.

With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
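The uniq suggestion above can be sketched as follows (the filenames are hypothetical). Because uniq only compares each line with its immediate predecessor, it handles presorted input of any size in effectively constant memory:

```shell
# Stand-in for the real presorted data file.
printf 'a\na\nb\nb\nb\nc\n' > sorted.txt

# uniq collapses adjacent duplicate lines, which is exactly the
# duplicate pattern a presorted file has.
uniq sorted.txt > deduped.txt

# -c prefixes each surviving line with its duplicate count,
# if the counts themselves are what you are after.
uniq -c sorted.txt > counted.txt
```
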
In reply to Re: Large file, multi dimensional hash - out of memory
by BrowserUk