Reading Elements of an array in Parallel

by rahulruns (Sexton)
on Feb 27, 2013 at 11:24 UTC
rahulruns has asked for the wisdom of the Perl Monks concerning the following question:

I have an array of host names on which I need to run a command to check the availability of RPMs. The list of hosts is quite big. Is it possible to run the rpm check command in parallel on all hosts? I have reduced the number of hosts in the code below for readability; in reality I have 100 hosts.

my @hosts = ("a01", "a02"); my @rpms = ("xx1", "xx2", "xx3", "xx4", "xx5", "xx6", "xx7"); print "CHECKING THE INSALLED RPM(s)\n"; foreach (@hosts) { my $host_name = $_; foreach (@rpms) { my $rpm; if ( "$_" eq "xx4" ) { $rpm = system ("ssh $host_name rpm -qa | grep $_ | grep -vi perl &>> $ +directory_create/rpm"); } else { $rpm = system ("ssh $host_name rpm -qa | grep $_ &>> $directory_create +/rpm"); } if ( $rpm == 0 ) { print " $_ RPM ................... IS PRESENT ON HOST \n"; } else { print " Exiting..................... "; exit 1; } } #foreach rpm loop }#foreach hosts loop ends here

Re: Reading Elements of an array in Parallel
by T_I (Novice) on Feb 27, 2013 at 12:28 UTC

    I've been working on a very similar issue and decided to use threads to process my data in parallel. This works fine when you don't have to end a thread after a time-out and start something else to get the same result. In this case I would simply wait up to a maximum timeout for all commands to finish, then process what you have, report the jobs that didn't report back, and exit the script (thus freeing the resources). A rough sketch of that approach follows.
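
    A minimal sketch of the threads approach described above, assuming one thread per host and omitting the timeout handling; check_host and the two-host list are hypothetical stand-ins for the ssh/rpm logic in the question:

    use strict;
    use warnings;
    use threads;

    # Stand-in host list; the real script has ~100 entries.
    my @hosts = ("a01", "a02");

    # Hypothetical per-host check; replace the body with the real ssh/rpm logic.
    sub check_host {
        my ($host) = @_;
        my $status = system("ssh $host rpm -qa > /dev/null 2>&1");
        return $status == 0 ? "$host: ok" : "$host: ssh/rpm check failed";
    }

    # Start one thread per host ...
    my @workers = map { threads->create(\&check_host, $_) } @hosts;

    # ... then collect each thread's result once it finishes.
    for my $thr (@workers) {
        my $line = $thr->join;
        print "$line\n";
    }

    With 100 hosts this starts 100 OS-level threads at once; if that turns out to be too heavy, capping the number of simultaneous workers (as Parallel::ForkManager below does with processes) is the usual fix.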

Re: Reading Elements of an array in Parallel
by RichardK (Priest) on Feb 27, 2013 at 12:58 UTC

    Parallel::ForkManager looks like a good fit for what you want; it makes it easy to start a number of processes to work through a list. The module's synopsis is below, followed by a sketch of how it could apply to your host loop.

    use Parallel::ForkManager;

    $pm = Parallel::ForkManager->new($MAX_PROCESSES);

    foreach $data (@all_data) {
        # Forks and returns the pid for the child:
        my $pid = $pm->start and next;

        ... do some work with $data in the child process ...

        $pm->finish; # Terminates the child process
    }
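
    A minimal sketch of wrapping the question's outer host loop with Parallel::ForkManager, assuming one child process per host and a cap on simultaneous checks; the host and RPM lists are the placeholders from the question, and rpm -q is used instead of rpm -qa | grep for brevity:

    use strict;
    use warnings;
    use Parallel::ForkManager;

    my @hosts = ("a01", "a02");               # the full 100-host list in practice
    my @rpms  = ("xx1", "xx2", "xx3", "xx4");
    my $max_processes = 10;                   # how many hosts to check at once

    my $pm = Parallel::ForkManager->new($max_processes);

    HOST:
    for my $host (@hosts) {
        $pm->start and next HOST;   # parent forks a child, then moves to the next host

        # Child process: check every RPM on this one host.
        for my $rpm (@rpms) {
            my $status = system("ssh $host rpm -q $rpm > /dev/null 2>&1");
            if ( $status == 0 ) {
                print " $rpm ................... IS PRESENT ON $host\n";
            }
            else {
                print " $rpm ................... MISSING ON $host\n";
            }
        }

        $pm->finish;                # child exits here
    }

    $pm->wait_all_children;         # parent waits for every host check to finish

    Each host is handled in its own child process, so up to $max_processes hosts are checked at the same time, while the inner RPM loop still runs sequentially inside each child.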

      Parallel::ForkManager helps with process forking, but I am not able to understand how it will help in reading all the elements at once.

        I'm not sure what you mean by "read all the elements at once"; can you explain further?
