in reply to Parallel::ForkManager and large hash, running out of memory
Strange as it may initially seem to suggest it, perhaps the best approach to this problem is to write a simple-minded program that finds one TSV file and converts it to RDF, and then, if necessary, spawn as many concurrent copies of that one simple-minded program (e.g. from the command line) as you have CPUs. Reduce the problem to a simple unit of work that can be parallelized, if necessary, across a collection of one or more processes, none of which has to know or care whether other instances of itself exist. One solitary instance can solve the entire problem; n instances merely solve it faster. Q.E.D.
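To make the idea concrete, here is a minimal sketch of the spawning side. The per-file converter is simulated with a stub (`cp`); in practice it would be your own script that reads one TSV file and emits RDF. The file names, directories, and the stub itself are illustrative assumptions, not part of the original post. `xargs -P` runs up to one worker per CPU, and each worker handles exactly one file, never knowing the others exist:

```shell
#!/bin/sh
set -e

# Set up a few dummy input files (stand-ins for the real TSV data).
mkdir -p tsv out
for i in 1 2 3 4; do
    printf 'col_a\tcol_b\n1\t2\n' > "tsv/file$i.tsv"
done

# Spawn up to $(nproc) concurrent workers; -n 1 gives each worker
# exactly one file. The inline sh -c script is the "simple-minded
# program": it converts one file and exits, oblivious to its siblings.
ls tsv/*.tsv | xargs -P "$(nproc)" -n 1 sh -c '
    f="$1"
    # Stub conversion step: a real converter would parse the TSV
    # and write RDF triples here.
    cp "$f" "out/$(basename "$f" .tsv).rdf"
' _

ls out
```

Running the sketch with one file, or with four CPUs and four files, produces the same set of outputs; parallelism only changes how long it takes, which is exactly the property argued for above.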