PerlMonks  

Run commands in parallel

by tilly (Archbishop)
on Aug 21, 2000 at 22:24 UTC ( #28870=snippet )

Description: This is a demo of how to keep a series of I/O bound jobs running in parallel. Aside from the dependency on /dev/null, it should be completely portable.

Set $debug to 0 to make it silent. And if you think you see possible improvements, you likely do. :-)

EDIT
Thanks to tye and jcwren I made it more portable by making the null data come from 'nul' under Windows. Note that if the parent is killed, spawned children will finish under both *nix and NT 4.0. I have not tested it on other platforms.

use Carp;
use strict;
use IPC::Open3;
use vars qw( $debug );
$debug = 1;

&run_parallel( 5, ['perl', '-e', 'die'], reverse map {"sleep $_"} 1..10);

# The first parameter is how many jobs to run at once, the remaining are
# the jobs.  Jobs may be a string, or an anonymous array of the cmd and
# args.
#
# All output from children goes to your STDERR and STDOUT.  They get no
# input.  It prints fairly uninformative errors for commands with
# problems, and returns a hash of problems.
#
# The jobs SHOULD NOT depend on each other!
sub run_parallel {
  my $job_count = shift;
  unless (0 < $job_count) {
    confess("run_parallel called without a positive parallel job count!");
  }
  my @to_start = @_;
  my %running;
  my %errors;
  my $is_running = 0;
  while (@to_start or %running) {
    if (@to_start and ($is_running < $job_count)) {
      # Launch a job
      my $job = shift @to_start;
      unless (ref($job)) {
        $job = [$job];
      }
      print "Launching '$job->[0]'\n" if $debug;
      local *NULL;
      my $null_file = ($^O =~ /Win/) ? 'nul': '/dev/null';   
      open (NULL, $null_file) or confess("Cannot read from $null_file: $!");
      my $proc_id = open3("<&NULL", ">&STDOUT", ">&STDERR", @$job);
      $running{$proc_id} = $job;
      ++$is_running;
    }
    else {
      # collect a job
      my $proc_id = wait();
      if (! exists $running{$proc_id}) {
        confess("Reaped unknown process $proc_id!");
      }
      elsif ($?) {
        # Oops
        my $job = $running{$proc_id};
        my ($cmd, @args) = @$job;
        my $err = "Running '$cmd' gave return code '$?'";
        if (@args) {
          $err .= join "\n\t", "\nAdditional args:", @args;
        }
        print STDERR $err, "\n";
        $errors{$proc_id} = $err;
      }
      print "Reaped '$running{$proc_id}->[0]'\n" if $debug;
      delete $running{$proc_id};
      --$is_running;
    }
  }
  return %errors;
}
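The collection half of the loop above leans on wait() and $?. A minimal, self-contained sketch of that reaping pattern (not part of the original snippet; the two dummy children and their exit codes are illustrative):

```perl
use strict;
use warnings;

# Fork two children -- one succeeds, one fails -- then reap them with
# wait() and record each exit status from $?, as run_parallel does.
my %exit_of;
my @kids;
for my $code (0, 1) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) { exit $code }   # child: exit immediately with its code
    push @kids, $pid;               # parent: remember the child
}
for (1 .. @kids) {
    my $pid = wait();               # blocks until some child exits
    $exit_of{$pid} = $? >> 8;       # high byte of $? is the exit code
}
my $failures = grep { $_ } values %exit_of;
print "children: ", scalar(@kids), ", failures: $failures\n";
```

The same `wait()`/`$?` bookkeeping, keyed by PID, is what lets run_parallel match each reaped process back to the job that spawned it.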
RE: Run commands in parallel
by merlyn (Sage) on Aug 22, 2000 at 01:38 UTC
    I've got an elaborate version of this tied up in the vault for my Linux Magazine column. I prefork a bunch of children, then essentially do a remote procedure call on them, passing arbitrary (well, Storable) data to and from the kids. Kids can also request additional tasks be queued.

    I say "in the vault" because it was published last month in the magazine, but I can't put it on my website for another two months. So, keep checking back!

    -- Randal L. Schwartz, Perl hacker
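    Merlyn's column isn't reproduced here, but the core idea -- shipping arbitrary Perl data to a forked child via Storable -- can be sketched roughly like this. The 4-byte length-prefix framing and the toy "task" shape are my assumptions, not his published code:

```perl
use strict;
use warnings;
use Storable qw(freeze thaw);

# Send one task (a Perl data structure) to a forked child over a pipe,
# framed as a 4-byte big-endian length prefix plus a Storable payload.
pipe(my $reader, my $writer) or die "pipe: $!";

my $pid = fork();
die "fork: $!" unless defined $pid;

if ($pid == 0) {                          # child: read one task, act on it
    close $writer;
    read($reader, my $len_buf, 4) == 4 or die "short read";
    read($reader, my $frozen, unpack('N', $len_buf));
    my $task = thaw($frozen);
    exit($task->{args}[0]);               # "run" the task: exit with its arg
}

close $reader;                            # parent: frame and send the task
my $frozen = freeze({ cmd => 'exit', args => [7] });
print {$writer} pack('N', length $frozen), $frozen;
close $writer;

waitpid($pid, 0);
my $child_exit = $? >> 8;
print "child exited with $child_exit\n";
```

    A preforking version would keep the pipes open and loop, sending many framed tasks to each long-lived child instead of forking per job.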

RE: Run commands in parallel
by tye (Cardinal) on Aug 22, 2000 at 22:26 UTC

    Check out File::Spec for a more portable way to get "/dev/null" (included with Perl 5.6).
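    Concretely, that substitution would look something like this -- File::Spec->devnull() returns the platform's null device, replacing the $^O check in the snippet:

```perl
use strict;
use warnings;
use File::Spec;

# Portable replacement for: ($^O =~ /Win/) ? 'nul' : '/dev/null'
my $null_file = File::Spec->devnull();   # '/dev/null' on *nix, 'nul' on Windows
print "null device: $null_file\n";
```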

            - tye (but my friends call me "Tye")
