PerlMonks
Re^2: Trouble with fork() by phonybone_monk (Initiate)
on May 22, 2017 at 22:03 UTC ( [id://1190898] )
I glossed over the initial processing of the file because I didn't think it was relevant to the problem at hand, but I'll expand. The input file is processed by open-source bioinformatics tools called "bowtie2" and "samtools". They are invoked via system() calls, and they write temporary files. All of that seems to work fine. After those programs are finished, my program (which I inherited, btw) does its forking and further processing.

So here's the funny thing. This program runs successfully on a MacPro with 48 GB of RAM and 24 CPU cores. On my Linux box, with 128 GB of RAM and 40 CPU cores, the program fails as I've described. Because of the way it's failing (some of the child processes die immediately with an error code of 11, i.e. EAGAIN), I'm assuming that I'm running out of some critical resource and my fork() is failing.

What I can't figure out, and the answer that I'm seeking, is: which resource? Is it memory? Swap space? Open file descriptors? Something else? How do I figure this out, other than by trial-and-error raising of each limit in turn, which is what I've been trying to do?

So am I entirely confident my algorithms are correct? Given that they run just fine on a smaller machine, I'm reasonably confident that they are, and that my problem is somehow related to the number of forks I'm running (one per logical CPU core).
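One way to narrow down which resource is exhausted is to inspect $! right after fork() fails: on Linux, fork sets EAGAIN when you've hit the per-user process limit (RLIMIT_NPROC, i.e. `ulimit -u`) or a kernel-wide task limit, and ENOMEM when there isn't enough memory/swap to duplicate the parent. A minimal sketch (the helper name `diagnostic_fork` is hypothetical, not from the original program):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Errno qw(EAGAIN ENOMEM);    # makes %! entries like $!{EAGAIN} available

# Hypothetical wrapper around fork() that reports *which* resource
# ran out when the fork fails, instead of just dying.
sub diagnostic_fork {
    my $pid = fork();
    return $pid if defined $pid;    # success: 0 in the child, child pid in the parent

    if ( $!{EAGAIN} ) {
        # Process-count limit, not memory: check `ulimit -u` and
        # /proc/self/limits ("Max processes" line).
        warn "fork: EAGAIN - per-user process limit or kernel task limit hit\n";
    }
    elsif ( $!{ENOMEM} ) {
        # Kernel could not allocate memory/swap for the child.
        warn "fork: ENOMEM - out of memory/swap for the new process\n";
    }
    else {
        warn "fork failed: $!\n";
    }
    return undef;
}
```

If the failures really are EAGAIN, `cat /proc/self/limits` from within the program (or `ulimit -u` in the invoking shell) is the first thing to check, since that limit is per-user and can be far lower than the hardware suggests.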
In Section
Seekers of Perl Wisdom