
Reading Elements of an array in Parallel

by rahulruns (Beadle)
on Feb 27, 2013 at 11:24 UTC ( #1020848=perlquestion )
rahulruns has asked for the wisdom of the Perl Monks concerning the following question:

I have an array of host names on which I need to run a command to check the availability of rpms. The list of hosts is quite big. Is it possible to run the rpm check command on all hosts in parallel? I have reduced the number of hosts in the code for readability; in reality I have 100 hosts.

my @hosts = ("a01", "a02");
my @rpms  = ("xx1", "xx2", "xx3", "xx4", "xx5", "xx6", "xx7");

print "CHECKING THE INSTALLED RPM(s)\n";
foreach (@hosts) {
    my $host_name = $_;
    foreach (@rpms) {
        my $rpm;
        if ( $_ eq "xx4" ) {
            $rpm = system("ssh $host_name rpm -qa | grep $_ | grep -vi perl &>> $directory_create/rpm");
        }
        else {
            $rpm = system("ssh $host_name rpm -qa | grep $_ &>> $directory_create/rpm");
        }
        if ( $rpm == 0 ) {
            print " $_ RPM ................... IS PRESENT ON HOST \n";
        }
        else {
            print " Exiting..................... ";
            exit 1;
        }
    } # foreach rpm loop
} # foreach hosts loop ends here

Replies are listed 'Best First'.
Re: Reading Elements of an array in Parallel
by RichardK (Parson) on Feb 27, 2013 at 12:58 UTC

    Parallel::ForkManager looks like a good fit for what you want; it makes it easy to start a number of processes to work through a list.

    use Parallel::ForkManager;

    my $pm = Parallel::ForkManager->new($MAX_PROCESSES);

    foreach my $data (@all_data) {
        # Forks and returns the pid for the child:
        my $pid = $pm->start and next;

        # ... do some work with $data in the child process ...

        $pm->finish;    # Terminates the child process
    }
    $pm->wait_all_children;
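    Applied to the original question, one host per child process might look like the sketch below. This is only an illustration, not a tested script: the host names, rpm list, and $directory_create path are placeholders taken from the question, and the ssh commands obviously need reachable hosts. Note also that system() hands the command to /bin/sh, where bash's &>> redirect is not portable, so the sketch uses ">> file 2>&1" instead.

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @hosts = ("a01", "a02");    # in practice, the full list of ~100 hosts
    my @rpms  = ("xx1", "xx2", "xx3", "xx4", "xx5", "xx6", "xx7");
    my $directory_create = "/tmp/rpm_check";    # placeholder path

    # Run up to 10 ssh sessions at once; tune to what the network can take.
    my $pm = Parallel::ForkManager->new(10);

    HOST:
    for my $host (@hosts) {
        $pm->start and next HOST;    # parent: fork a child, move to next host

        # Child process: check every rpm on this one host, then exit.
        for my $rpm (@rpms) {
            my $filter = $rpm eq "xx4" ? "grep $rpm | grep -vi perl"
                                       : "grep $rpm";
            my $rc = system("ssh $host rpm -qa | $filter >> $directory_create/rpm 2>&1");
            if ( $rc != 0 ) {
                warn "$rpm missing on $host\n";
                $pm->finish(1);    # child exit status 1 signals failure
            }
        }
        $pm->finish(0);    # all rpms present on this host
    }
    $pm->wait_all_children;

    Exiting only the child (via finish) rather than the whole script means one missing rpm does not abort the checks still running against the other hosts; the parent can inspect child exit codes with a run_on_finish callback if it needs a summary.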

      Parallel::ForkManager helps in the case of process forking, but I am not able to understand how it will help in reading all elements at once.

        I'm not sure what you mean by "read all the elements at once" — can you explain further?

Re: Reading Elements of an array in Parallel
by T_I (Novice) on Feb 27, 2013 at 12:28 UTC

    I've been working on a similar issue and decided to use threads to process my data in parallel. This works fine as long as you don't have to kill a thread after a time-out and start something else to get the same result. In your case I would simply wait up to a maximum timeout for all commands to finish, then process what you have, report the jobs that didn't report back, and exit the script (thus freeing the resources).
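    The thread-per-host idea can be sketched as below. This is a minimal, hypothetical example: check_host() is a stand-in that just returns a status string so the sketch is runnable; in the real script it would run the ssh/rpm command, and perl must be built with ithreads support for the threads module to load.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use threads;

    # Hypothetical host list; replace with the real host names.
    my @hosts = ("a01", "a02", "a03");

    # Stand-in for the per-host check. A real version would do something
    # like: my $rc = system("ssh $host rpm -q some-rpm");
    sub check_host {
        my ($host) = @_;
        return "$host: ok";
    }

    # Start one thread per host; all checks run concurrently.
    my @threads = map { threads->create( \&check_host, $_ ) } @hosts;

    # join() blocks until each thread finishes and returns its result.
    for my $t (@threads) {
        my $result = $t->join();
        print "$result\n";
    }

    With 100 hosts this starts 100 threads at once; if that is too heavy, a small pool (or Thread::Queue feeding a fixed number of workers) keeps the concurrency bounded.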

Node Type: perlquestion [id://1020848]
Approved by ww