PerlMonks
If I understand correctly, the crux of the matter is: you need to look up data in a file that is too big to hold in RAM? Perhaps you can build an index for the file that is smaller than the data itself, and then use that index to seek into the file. If performance is an issue, memoizing your lookup may help. But then again, premature optimization and all that...

You may also want to put the data into a DB. SQLite is a great way to go if you just want to create the database once and then use it for lookups. It is not so great if you need concurrent access; in that case you may need a real DB server.

Cheers,
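A minimal sketch of the index idea: keep only each key's byte offset in RAM, then seek() to fetch the record on demand. The file name and the "key&lt;TAB&gt;value" line format here are assumptions for illustration, not your actual data layout.

```perl
use strict;
use warnings;

my $file = 'lookup_demo.txt';

# Create a tiny sample data file so the sketch is self-contained.
open my $out, '>', $file or die "write $file: $!";
print {$out} "apple\tred\nbanana\tyellow\ngrape\tpurple\n";
close $out;

# Pass 1: record each key's byte offset instead of its value.
# The index hash holds one integer per key, not the whole record.
open my $fh, '<', $file or die "read $file: $!";
my %offset;
while (1) {
    my $pos  = tell $fh;               # offset BEFORE reading this line
    my $line = <$fh>;
    last unless defined $line;
    my ($key) = split /\t/, $line, 2;
    $offset{$key} = $pos;
}

# On demand: seek straight to the line and read only that record.
sub lookup {
    my ($key) = @_;
    return undef unless exists $offset{$key};
    seek $fh, $offset{$key}, 0 or die "seek: $!";
    my $line = <$fh>;
    chomp $line;
    my (undef, $value) = split /\t/, $line, 2;
    return $value;
}

print lookup('banana'), "\n";   # prints "yellow"
```

If even the offset hash is too big, the same index can live in a DBM file (e.g. via tie and DB_File) or an SQLite table instead of memory.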
Pereant, qui ante nos nostra dixerunt!
In reply to Re: Parallel::ForkManager and large hash, running out of memory by Random_Walk