Re^3: Parallel::Forkmanager and large hash, running out of memory

by sundialsvc4 (Abbot)
on Apr 25, 2013 at 13:05 UTC ( [id://1030664] )


in reply to Re^2: Parallel::Forkmanager and large hash, running out of memory
in thread Parallel::Forkmanager and large hash, running out of memory

A problem like this one could be handled by scanning all the files ahead of time and pushing the lookup values into a database table. That would avoid the need to search for the answers at processing time, which could otherwise largely defeat your efforts at parallelization. A pre-scanner could loop through the directory, query the table to see whether it has seen a particular file before, and, if not, extract the lookups and store them. On each run it would only consider new files. (In the same table you could also note whether a particular file has already been processed, and something like an SHA-1 hash of the file contents could be used to recognize changes.)
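
Here is a minimal sketch of such a pre-scanner, assuming SQLite via DBI for the tables, Digest::SHA for the fingerprint, and a hypothetical extract_lookups() standing in for whatever pulls the values you need out of one file; none of those specifics come from the original post.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use Digest::SHA;

    my $dir = shift // '.';

    my $dbh = DBI->connect( "dbi:SQLite:dbname=prescan.db", "", "",
                            { RaiseError => 1, AutoCommit => 1 } );

    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS files (
        path      TEXT PRIMARY KEY,
        sha1      TEXT NOT NULL,
        processed INTEGER NOT NULL DEFAULT 0
    )
    SQL
    $dbh->do(<<'SQL');
    CREATE TABLE IF NOT EXISTS lookups (
        path  TEXT NOT NULL,
        key   TEXT NOT NULL,
        value TEXT
    )
    SQL

    my $seen   = $dbh->prepare('SELECT sha1 FROM files WHERE path = ?');
    my $upsert = $dbh->prepare(
        'INSERT OR REPLACE INTO files (path, sha1, processed) VALUES (?, ?, 0)');
    my $store  = $dbh->prepare(
        'INSERT INTO lookups (path, key, value) VALUES (?, ?, ?)');

    opendir my $dh, $dir or die "Cannot open $dir: $!";
    while ( my $name = readdir $dh ) {
        my $path = "$dir/$name";
        next unless -f $path;

        # Fingerprint the file so changed contents are noticed later.
        my $sha1 = Digest::SHA->new(1)->addfile($path)->hexdigest;

        # Skip files already seen, unless their contents changed.
        $seen->execute($path);
        my ($old) = $seen->fetchrow_array;
        $seen->finish;
        next if defined $old && $old eq $sha1;

        $upsert->execute( $path, $sha1 );

        # Only new (or changed) files reach this point.
        for my $pair ( extract_lookups($path) ) {
            $store->execute( $path, $pair->[0], $pair->[1] );
        }
    }
    closedir $dh;

    # Hypothetical placeholder: replace with whatever extracts the
    # lookup values your real job needs from a single file.
    sub extract_lookups {
        my ($path) = @_;
        return;    # e.g. ( [ key1 => value1 ], [ key2 => value2 ] )
    }

Run it periodically (or just before the main job) and the expensive "have I seen this?" question becomes a cheap indexed query instead of a re-scan of every file.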

This, once again, reduces the problem to a single-process handler that can be run in parallel with itself on the same and/or different systems.
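
As a sketch of that "run in parallel with itself" idea, and assuming the same files table as above plus a hypothetical process_file() for the per-file work, each worker simply claims unprocessed rows; the dispatch shown here with Parallel::ForkManager is one illustrative choice, not something prescribed by the post.

    use strict;
    use warnings;
    use DBI;
    use Parallel::ForkManager;

    my $dbh = DBI->connect( "dbi:SQLite:dbname=prescan.db", "", "",
                            { RaiseError => 1, AutoCommit => 1 } );

    # Work list: everything the pre-scanner found but nobody has handled yet.
    my $todo = $dbh->selectcol_arrayref(
        'SELECT path FROM files WHERE processed = 0');

    my $pm = Parallel::ForkManager->new(4);

    for my $path (@$todo) {
        $pm->start and next;    # fork a child for this file

        # Children must use their own database handle after the fork.
        my $child_dbh = DBI->connect( "dbi:SQLite:dbname=prescan.db", "", "",
                                      { RaiseError => 1, AutoCommit => 1 } );
        process_file( $child_dbh, $path );
        $child_dbh->do( 'UPDATE files SET processed = 1 WHERE path = ?',
                        undef, $path );

        $pm->finish;
    }
    $pm->wait_all_children;

    # Hypothetical placeholder for the real per-file work.
    sub process_file {
        my ( $dbh, $path ) = @_;
        return;
    }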

By all means, if you have now hit upon a procedure that works, I am not suggesting that you rewrite it.
