ralph2014 has asked for the wisdom of the Perl Monks concerning the following question:
Morning perl monks!
I'm just testing the waters here; I'm fairly new to Perl. What I need to do is this:
I will have a text file with 4000 IPs in it
I need to go through this list as quickly as possible. For each IP I will open an SSH connection, run several commands, and capture the stdout of the last command, which returns comma-separated output; each result line needs to include the IP plus that output, and go into a CSV file.
ALL of the last commands' output then needs to end up in one CSV file covering all the IPs.
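The steps above (connect to each IP, grab the last command's CSV line, merge everything into one file) can be sketched with nothing but core Perl. This is only a skeleton under assumptions: the IP list, file names, and remote command are made up, the concurrency cap is set low for demonstration, and the actual ssh call is stubbed with a local echo so the skeleton runs standalone:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Sample data stands in for reading ips.txt; names here are hypothetical.
my @ips      = qw(10.0.0.1 10.0.0.2 10.0.0.3);
my $max_kids = 2;    # throttle; 20-50 is more realistic for 4000 hosts
my $dir      = tempdir( CLEANUP => 1 );   # one temp file per child: no CSV write races

my $active = 0;
for my $i ( 0 .. $#ips ) {
    if ( $active >= $max_kids ) {         # wait for a free slot before forking again
        wait();
        $active--;
    }
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {                    # child: probe one host, write one temp file
        my $ip = $ips[$i];

        # The real version would be something along these lines (untested):
        #   my $out = qx{ssh -o ConnectTimeout=5 user\@$ip 'cmd1; cmd2; last_cmd'};
        # Stubbed with a local echo so this skeleton runs anywhere:
        my $out = qx{echo "$ip,OK"};

        open my $fh, '>', "$dir/$i.csv" or die "write $dir/$i.csv: $!";
        print $fh $out;
        close $fh;
        exit 0;
    }
    $active++;
}
wait() while $active-- > 0;    # reap the remaining children

# Parent merges the per-child files into the single CSV
open my $csv, '>', "$dir/all.csv" or die "write all.csv: $!";
for my $i ( 0 .. $#ips ) {
    open my $fh, '<', "$dir/$i.csv" or next;   # a missing file means that host failed
    print $csv <$fh>;
    close $fh;
}
close $csv;

open my $in, '<', "$dir/all.csv" or die $!;
my @rows = <$in>;
close $in;
print "merged ", scalar @rows, " rows\n";   # prints "merged 3 rows"
```

Only the parent ever writes the merged CSV, so there is no locking to worry about; if a host fails, its temp file is simply absent and gets skipped. CPAN modules like Parallel::ForkManager (for the throttling) and Net::OpenSSH (for reusable SSH connections) would tidy this up further.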
I was thinking of loading the IP list into memory rather than cycling through the file. I was then thinking of forking rather than full-on threads, and temporarily writing each fork's output to its own in-memory file (because forks all writing to the same CSV would cause problems), then finally merging the output into the CSV. I also intend to use Log4perl to keep track of things, plus a separate file for IPs where SSH failed to connect.
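For the Log4perl side, a minimal config along these lines (the file name is made up) would send progress messages to one log file; a second appender could do the same for the SSH-failure list:

```
log4perl.logger                                 = INFO, Scan
log4perl.appender.Scan                          = Log::Log4perl::Appender::File
log4perl.appender.Scan.filename                 = scan.log
log4perl.appender.Scan.layout                   = Log::Log4perl::Layout::PatternLayout
log4perl.appender.Scan.layout.ConversionPattern = %d %p %m%n
```

Loaded with `Log::Log4perl->init('log4perl.conf')`, this keeps logging config out of the script itself.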
The machine this will run on is a 4-core HP server with RAID 5 drives.
So, monks, I would be grateful for any input on how you would approach this.