PerlMonks

Fork Results in thousands of processes

by anshumangoyal (Scribe)
on Nov 28, 2011 at 10:56 UTC
anshumangoyal has asked for the wisdom of the Perl Monks concerning the following question:

I am having an issue with fork in Perl. I want to launch 10 forked processes at a go from a single script, and all 10 child (forked) processes will do the same thing (copy files from one place to another).
    while ($callCount <= $totalCalls) {
        for (1..$TotalProcessToFork) {
            print "Call -> $callCount";
            if ($pid = fork) {
                # in Parent Process
                print " :: PID -> $pid\n";
                push(@list_of_pid, $pid);
            } else {
                # in Child Process
                `touch $callCount`;
            }
            $callCount++;
        }
    }
Now when I execute this code, around 1000 child processes end up running. Can anyone tell me what I am doing wrong here?

Re: Fork Results in thousands of processes
by ikegami (Pope) on Nov 28, 2011 at 10:59 UTC
    { # in Child Process
        ...
        exit(0);   # <- Missing
    }
    Or:
    { # in Child Process
        exec('touch', $callCount);
        die $!;
    }
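    Putting that fix into the OP's loop, a minimal sketch (reusing the OP's variable
    names, and leaving error handling for a failed fork aside) might look like:

        while ($callCount <= $totalCalls) {
            for (1..$TotalProcessToFork) {
                print "Call -> $callCount";
                if (my $pid = fork) {
                    # in Parent Process
                    print " :: PID -> $pid\n";
                    push(@list_of_pid, $pid);
                } else {
                    # in Child Process: do the work, then leave immediately
                    `touch $callCount`;
                    exit(0);   # without this the child falls back into the outer loop and forks again
                }
                $callCount++;
            }
        }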
      But now my parent process dies when I use exec('touch', $callCount); die $!; in the child process. Here is the code:

      while ($callCount <= $totalCalls) {
          for (1..$callPerSec) {
              printf "Call -> $callCount";
              my $pid = fork();
              if ($pid) {
                  # in Parent Process
                  printf " :: PID -> $pid\n";
                  push(@list_of_pid, $pid);
              } elsif ($pid == 0) {
                  # in Child Process
                  exec('touch', $callCount);
                  die $!;
              } else {
                  printf "Resource Not available\n";
              }
              $callCount++;
          }
      }
      print "Total Calls Executed $callCount\n";

Re: Fork Results in thousands of processes
by sundialsvc4 (Monsignor) on Nov 29, 2011 at 15:48 UTC

    Modules such as Parallel::ForkManager might be useful.

    CPAN modules like these are “simple sugar,” to be sure, in that they do not do anything that you could not do for yourself, but they do make life very easy ... they let you focus more on what you want to do, and less on exactly how to do it.   The example in the documentation is virtually identical to what you are trying to do here.
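    For illustration, a sketch along the lines of the module's documented pattern,
    adapted to the touch example above (the limit of 10 and the count of 1,000 are
    assumptions made for the example):

        use strict;
        use warnings;
        use Parallel::ForkManager;

        my $pm = Parallel::ForkManager->new(10);    # at most 10 children at a time

        for my $callCount (1 .. 1000) {
            $pm->start and next;            # parent gets the child PID and moves on
            # child: do the work, then finish
            system('touch', $callCount);
            $pm->finish;                    # child exits here
        }
        $pm->wait_all_children;             # parent blocks until every child is reaped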

    You might also wish to search http://search.cpan.org for modules like Thread::Queue (there are, at the moment, 33 such modules found by that search), because it may well be that you want to start a limited number of processes (say, ten) and then have each of them consume an arbitrary number of work-requests that you have provided in a queue.   In other words, by launching (say) ten workers, you ensure that ten requests at a time are handled simultaneously; and by stuffing (say) 1,000 names or commands into the queue, you provide the total list of work that will be carried out, cooperatively, by the workers in that pool.   Stuff the queue full of filenames, followed by enough “please die now” instructions for every worker to eventually receive one, and then simply wait for them to die off on command.

    The nice thing about a design like this is that it has a very convenient throttle.   You can set the number of worker threads up or down, and this has nothing to do with the amount of work that is to be accomplished.   Even if the backlog grows very large indeed, the completion rate will remain steady and predictable and can be effectively “tuned” to suit the hardware.
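    A rough sketch of that queue-plus-workers shape, using threads and Thread::Queue
    with undef as the "please die now" token (the worker count and the fake file list
    are made up for illustration, and a threads-enabled perl is assumed):

        use strict;
        use warnings;
        use threads;
        use Thread::Queue;

        my $workers = 10;
        my $queue   = Thread::Queue->new;

        # Start a fixed pool of workers; each consumes items until it dequeues undef.
        my @pool = map {
            threads->create(sub {
                while (defined(my $file = $queue->dequeue)) {
                    system('touch', $file);    # stand-in for the real per-file work
                }
            });
        } 1 .. $workers;

        $queue->enqueue(1 .. 1000);             # the backlog of work
        $queue->enqueue((undef) x $workers);    # one "please die now" per worker

        $_->join for @pool;                     # wait for the pool to drain and exit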

      How are you going to use Thread::Queue, which relies on memory shared between threads, with Parallel::ForkManager, which creates separate processes?

      Why do you keep blurting out these pointless, useless, wholly wrong replies to questions on subjects of which you obviously have no practical knowledge whatsoever?

      Do you just hate newbies? Or do you just delight in sending them off down blind alleys for your own amusement?


      With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
      Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
      "Science is about questioning the status quo. Questioning authority".
      In the absence of evidence, opinion is indistinguishable from prejudice.

        The two comments are obviously not intended to be so closely related, mine unintended enemy, as anyone who is not “mine unintended enemy” would very plainly understand.   (Do you downvote my posts both as yourself and using some sock-puppet account?   Just curious.)

        There are two very valid points contained in the aforementioned reply:

        1. That you don’t need to futz around with too much manually written code ... and the bugs and annoyances that go with it, e.g. in the OP’s original solution ... because CPAN is well-equipped with solutions that other folks have already written.
        2. That it is very useful, in any multi-thread or multi-process solution, to use “work to do” queues, with a limited number of workers who consume entries from the queue.   The number of workers is independent from the number of requests that need to be serviced.   An existing CPAN package is employed to implement the queue and all of its perhaps operating-system-specific ugliness.

        What, then, is “the point?”   Easy.   At first, you approach this language with the notion that “I have to write this.”   Then, you discover that, thanks to CPAN, you don’t “have to write this.”   You do not have to stumble-around with code that is literally a take-off of Unix fork() when there is, perhaps unbeknownst to you, a package like Parallel::ForkManager at your beck and call.   That’s huge.   But not obvious.

        Quite frankly, good sir, your ongoing determination to conduct a vendetta against me clouds your own vision.   I think that even “newbies” are likely to be professional computer programmers who are very able indeed to know when a particular package, cited only as a “for instance,” is or is not appropriate to their particular project.   Yeah, I’ll just betcha that they do know that “threads and processes play by different rules in most operating systems,” without either you or I having any moral obligation to educate them concerning this point.   So perhaps we should all simply trust the newbies on that score, as peculiar as that thought may initially seem.   I will wager that neither you nor I are, in fact, “the smartest kids in this school.”   And, yeah, I am perfectly aware that you have earned many times more experience-points than have I, so I am not presuming that this discussion is “among equals.”

      Did you mean Proc::Queue (which limits the number of forks you can do)?
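      If Proc::Queue is the module meant, a sketch based on its documented fork-limiting
      behaviour (the limit of 10 and the count of 1,000 are assumptions for the example)
      would be roughly:

          use strict;
          use warnings;
          use Proc::Queue size => 10;    # fork() now waits once 10 children are alive

          for my $callCount (1 .. 1000) {
              my $pid = fork;
              if (defined $pid and $pid == 0) {
                  # child: do the work and leave
                  system('touch', $callCount);
                  exit(0);
              }
          }
          1 while wait != -1;    # reap whatever is still running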
