PerlMonks |
Re^2: Connecting Remote machines using perl.
by eweaver (Sexton) on Dec 19, 2005 at 16:12 UTC ( [id://517774] )
I use a 128 node cluster, and the environment is fixed. Each node has its own local disk space accessible in /tmp/ and various places, plus a network share where you put your code. What you should be doing is putting your module on the network share. If you don't have one, you must have a very painful clustering setup to work with! On my system (SunFire/Linux), if the lab staff update the build environment, it propagates to all the nodes. Also, jobs have to be submitted through a load-balancing queuer anyway; you can't get persistent storage on any _specific_ node, because the queuer assigns you whatever nodes happen to be free.

If you are under the impression that Perl modules have to be installed in /usr/local/lib/..., and that directory isn't synchronized across your systems (possible, but unlucky!), you are wrong: you can install a module anywhere. Just update your library path with a little use lib qw(/path/to/your/module); at the beginning of your Perl program. You don't even have to change environment variables.

If none of these seem like options, then you should probably ask your cluster administrator what to do. Alternatively, you could pipe commands to `ssh` with open() or system() or something, but that's less than ideal.

~e
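The "install anywhere" trick above can be sketched like this. The path /net/share/yourname/perl5/lib is a hypothetical stand-in for wherever your network share actually is:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical shared location; substitute your own network-share path.
use lib '/net/share/yourname/perl5/lib';

# Any module installed under that directory now resolves normally, e.g.:
# use My::Cluster::Helper;

# use lib works by prepending the directory to @INC, Perl's module
# search path, at compile time:
print "shared lib dir is on \@INC\n"
    if grep { $_ eq '/net/share/yourname/perl5/lib' } @INC;
```

Because `use lib` takes effect at compile time, it happens before any later `use Some::Module` lines are processed, which is why no PERL5LIB environment variable is needed.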
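The less-ideal fallback mentioned at the end, piping commands to `ssh` with open(), might look like the sketch below. The node name node01 and the remote command are hypothetical, and this assumes passwordless (key-based) ssh to the node:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical node name and command; assumes key-based ssh auth.
my @cmd = ('ssh', 'node01', 'uname -n');

# The list form of pipe-open forks and execs without a shell, so the
# arguments are passed to ssh intact.
open my $remote, '-|', @cmd
    or die "can't spawn @cmd: $!";
while (my $line = <$remote>) {
    print "remote says: $line";
}
close $remote or warn "ssh exited with status $?";
```

For fire-and-forget commands where you don't need the output, system('ssh', 'node01', $cmd) uses the same list form and is simpler.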
In Section: Seekers of Perl Wisdom