PerlMonks |
Re^3: Parallel::ForkManager and large hash, running out of memory
by sundialsvc4 (Abbot)
on Apr 25, 2013 at 13:05 UTC ( [id://1030664] )
A problem like that one could be handled by scanning all the files ahead of time and pushing the lookup values into a database table. That would avoid the need to "look for" the answers you want, which could otherwise largely defeat your efforts at parallelization. A pre-scanner could loop through the directory, query the table to see whether it has seen a particular file before, and, if not, grab the lookups and store them. On each run it would only consider new files. (In the same table, you could also note whether a particular file had already been processed, and a SHA-1 digest could be used to recognize changes.) This, once again, reduces the problem to a single-process handler that can be run in parallel with itself on the same and/or different systems.

By all means, if you have now hit upon a procedure that works, I am not suggesting that you rewrite it.
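A minimal sketch of the pre-scanner idea above, using DBI with SQLite and Digest::SHA. The table name, column names, and database file are my own assumptions, not anything from the thread; the point is only the shape of the technique: record each file's path and SHA-1 digest, and on each run return only files that are new or whose digest has changed.

```perl
use strict;
use warnings;
use DBI;
use Digest::SHA;

# Hypothetical schema: one row per file, keyed by path, with the last
# SHA-1 digest seen and a "processed" flag a downstream worker can set.
my $dbh = DBI->connect('dbi:SQLite:dbname=seen_files.db', '', '',
                       { RaiseError => 1 });
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS seen_files (
        path      TEXT PRIMARY KEY,
        sha1      TEXT NOT NULL,
        processed INTEGER NOT NULL DEFAULT 0
    )
});

my $lookup = $dbh->prepare('SELECT sha1 FROM seen_files WHERE path = ?');
my $upsert = $dbh->prepare(
    'INSERT OR REPLACE INTO seen_files (path, sha1, processed) VALUES (?, ?, 0)');

# Scan a directory; return only files not seen before, or whose
# contents (by SHA-1) have changed since the last scan.
sub new_or_changed_files {
    my ($dir) = @_;
    my @todo;
    opendir my $dh, $dir or die "Cannot open $dir: $!";
    for my $name (readdir $dh) {
        my $path = "$dir/$name";
        next unless -f $path;
        my $sha1 = Digest::SHA->new('sha1')->addfile($path)->hexdigest;
        $lookup->execute($path);
        my ($seen) = $lookup->fetchrow_array;
        $lookup->finish;
        next if defined $seen && $seen eq $sha1;   # unchanged; skip it
        $upsert->execute($path, $sha1);            # remember (new) digest
        push @todo, $path;
    }
    closedir $dh;
    return @todo;
}
```

Each parallel worker (whether a Parallel::ForkManager child or a process on another box) can then pull its work from the table rather than re-scanning the filesystem, which is what keeps the handlers independent of one another.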
In Section: Seekers of Perl Wisdom