PerlMonks
Parallel::ForkManager Memory issue
by hotel (Beadle)
on Mar 16, 2013 at 21:35 UTC ( [id://1023853] )
hotel has asked for the wisdom of the Perl Monks concerning the following question:

Hi all, Lately I've decided to use ForkManager to harvest the whole computing power available to me and to speed things up. But either I got the whole idea wrong or I cannot code it properly.

I have a script that parses a large file (a few hundred MBs) and stores some stuff in a hash called H. As H gets populated, more memory is used. Next, I want to pass a reference to H to a subroutine where I make a system call with each and every key of H. The outputs of the software invoked by these system calls are independent of each other; they are basically output files. When only one processor is used to do this it takes a lot of time, so I'd like to use all 16 cores to initiate 16 system calls simultaneously. Among other things, here's what I recently tried.
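The code the poster tried is not preserved in this extract. A minimal sketch of the usual Parallel::ForkManager pattern for this kind of job follows; `%h` stands in for the hash H, and `SomeBinary` is the placeholder name of the external tool from the question:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

# %h stands in for the hash H built while parsing the large file.
my %h = map { $_ => 1 } qw(key1 key2 key3);

my $pm = Parallel::ForkManager->new(16);   # at most 16 concurrent children

for my $key (sort keys %h) {
    $pm->start and next;          # parent: child launched, move to next key
    # --- child only from here on ---
    system('SomeBinary', $key);   # run the external tool for this key
    $pm->finish($? >> 8);         # child exits with the tool's exit status
}
$pm->wait_all_children;           # parent blocks until every child is reaped
print "done\n";
```

Note that `$pm->start` returns the child's PID in the parent (true, so the parent skips to the next key) and 0 in the child, which falls through to the `system` call; `$pm->finish` must be the child's last statement, or the child would continue iterating the loop.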
When I use top in a terminal to have a look at the usage, I see that my Perl script Script has started 16 processes of the same name Script, each using the same amount of memory. The amount of memory is the same as what was needed while populating H. So obviously I'm doing something very wrong here, because I was expecting to see one Script and 16 SomeBinary processes, each SomeBinary consuming as much memory as it requires (which should be very little). But it seems like I'm creating copies of the Script perl process. What am I getting/doing wrong? Many thanks.
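What top shows here is expected fork() behaviour rather than (necessarily) a bug: each child is a forked copy of the whole Perl interpreter, including H, and on Linux those copies share pages copy-on-write, so top's per-process resident size overstates the real total. Also, `system()` inside a child forks *again*, so the large Perl child stays alive alongside SomeBinary. If the goal is one small process per job, the child can `exec()` the binary so its copy of the interpreter is replaced outright. A minimal core-Perl sketch of that fork-then-exec pattern, with a trivial perl one-liner standing in for SomeBinary so it is runnable:

```perl
use strict;
use warnings;

my @keys = qw(a b c);    # stands in for the keys of H
my @pids;

for my $key (@keys) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {                 # child
        # exec() replaces this child with the external program, so the
        # duplicated interpreter (and its copy of H) goes away at once.
        exec $^X, '-e', 'exit 0'       # stands in for: exec 'SomeBinary', $key
            or die "exec failed: $!";
    }
    push @pids, $pid;                  # parent: remember child to reap later
}

waitpid( $_, 0 ) for @pids;            # reap all children
print "all children exited\n";
```

This launches all jobs at once; capping concurrency at 16 is exactly the bookkeeping Parallel::ForkManager adds on top of this pattern.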
Back to Seekers of Perl Wisdom