
safely passing args through ssh

by perl5ever (Pilgrim)
on Jul 25, 2011 at 06:14 UTC ( #916491=perlquestion )
perl5ever has asked for the wisdom of the Perl Monks concerning the following question:

One problem with ssh is that it invokes a shell to exec your remote command and the arguments are re-interpreted via that shell.

For example, let argcount be the following perl script:

#!/usr/bin/env perl
print "arg count: ", scalar(@ARGV), "\n";
Then running argcount 'a b c' returns 1, but running ssh localhost argcount 'a b c' returns 3. Moreover, running ssh localhost argcount 'a;b;c' exhibits even more undesirable behavior.
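The re-splitting can be reproduced locally, without ssh at all, by interposing sh -c the way sshd interposes the remote shell. This is a local simulation (not from the post), using one-liners in place of argcount:

```perl
# locally, the quoted string arrives as one argument
my $local = `perl -e 'print scalar(\@ARGV)' 'a b c'`;

# sh -c re-parses the joined words, just as the remote shell does for ssh
my $remote = `sh -c "perl -e 'print scalar(\@ARGV)' a b c"`;

print "local=$local remote=$remote\n";
die "unexpected counts" unless $local == 1 && $remote == 3;
```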

So, suppose you want to execute a command like perl -e ... that will work as expected even if it is executed remotely via ssh. Clearly, ... cannot contain any spaces or shell meta-characters. The question is: what's a good way of encoding ... so that it survives an ssh call?

To be specific about the problem, let backticks() be defined as follows:

sub backticks {
    open(my $fh, "-|", @_) or die "fork failed: $!";
    do { local($/); <$fh> };
}
The problem is to define a function E() such that all of these give the same results:
backticks('perl', '-e', $x);
backticks('perl', '-e', E($x));
backticks('ssh', $host, 'perl', '-e', E($x));
backticks('ssh', $host1, 'ssh', $host2, 'perl', '-e', E($x));
Here $x is a Perl string containing arbitrary Perl source code. Note that the last backticks example likely precludes any approach that relies on using backslashes to escape meta-characters, although I'm not totally sure about this.

Here's an example of a possible solution:

sub E {
    my $f = 'H' . (2 * length($_[0]));
    return "eval(pack(q/$f/,q/" . unpack("H*", $_[0]) . '/))';
}
The idea is to hex-encode the string to ensure that the result doesn't contain any spaces or shell meta-characters.
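As a quick local sanity check (my own sketch, not from the post), the hex round trip behind E() can be verified without any ssh involved; the payload is pure hex digits, which no shell will re-split:

```perl
# hex-encode a snippet of Perl source, then decode it back
my $x   = 'print "arg count: ", scalar(@ARGV), "\n";';
my $hex = unpack("H*", $x);

# the payload survives any number of shell passes: hex digits only
die "unexpected chars" unless $hex =~ /^[0-9a-f]+\z/;

# pack with the matching H-count reverses the encoding exactly
my $decoded = pack('H' . (2 * length $x), $hex);
die "round trip failed" unless $decoded eq $x;

print "round trip ok\n";
```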

Are there any other ways of solving this problem?

Update: Note that I am not looking to encode an arbitrary shell command. Another statement of the problem is this:

Given an array ref $invoke_perl which will invoke a perl interpreter via open(..., "-|", @$invoke_perl), and given a scalar $script containing perl source, how do I encode $script (resulting in E($script)) so that:

open(..., "-|", @$invoke_perl, '-e', E($script))
will pass the arguments ['-e', $script] to the perl interpreter?

Re: safely passing args through ssh
by ikegami (Pope) on Jul 25, 2011 at 08:18 UTC

    ssh takes a shell command.

# Passing <<echo>> <<$$>> executes <<echo $$>>
$ ssh echo '$$'
10920

# Passing <<echo>> <<'$$'>> executes <<echo '$$'>>
$ ssh echo ''\''$$'\'''
$$

# Passing <<echo '$$'>> executes <<echo '$$'>>
$ ssh 'echo '\''$$'\'''
$$

    So you have to build a shell command.

sub text_to_shell_lit(_) {
    return $_[0] if $_[0] =~ /^[a-zA-Z0-9_\-]+\z/;
    my $s = $_[0];
    $s =~ s/'/'\\''/g;
    return "'$s'";
}

my $remote_cmd = join ' ',
    map text_to_shell_lit,
        perl => ( '-e' => $perl_code );

backticks(ssh => ( '--', $target, $remote_cmd ));
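A local way to sanity-check this quoting (my own sketch, not part of the reply) is to round-trip an awkward string through a real shell and compare:

```perl
sub text_to_shell_lit {
    return $_[0] if $_[0] =~ /^[a-zA-Z0-9_\-]+\z/;
    my $s = $_[0];
    $s =~ s/'/'\\''/g;
    return "'$s'";
}

# a string with spaces, a semicolon, a quote, and a dollar sign
my $tricky = q{a b;c'd$e};
my $quoted = text_to_shell_lit($tricky);

# ask a real shell to hand the quoted word back as a single argument
my $out = `printf '%s' $quoted`;
die "round trip failed" unless $out eq $tricky;
print "quote ok\n";
```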
      I'm not sure this survives a double-ssh call:
print backticks('ssh', 'localhost', '--', 'ssh', 'localhost', '--',
    'argcount', text_to_shell_lit('a b c'));
      This prints 3 for me.

      As I mentioned in the original post, I was doubtful that backslash-escaping could solve the problem, but there still might be a way to do it.

That doesn't look anything like what I wrote. I showed how to construct a shell command to pass to ssh, but you passed five arguments after the target, none of which is a shell command.

        If you call ssh twice, you have two shell commands to build.

sub text_to_shell_lit(_) {
    return $_[0] if $_[0] =~ /^[a-zA-Z0-9_\-]+\z/;
    my $s = $_[0];
    $s =~ s/'/'\\''/g;
    return "'$s'";
}

my $very_remote_cmd = join ' ',
    map text_to_shell_lit,
        argcount => ( 'a b c' );

my $remote_cmd = join ' ',
    map text_to_shell_lit,
        ssh => ( '--', $very_remote_target, $very_remote_cmd );

backticks(ssh => ( '--', $remote_target, $remote_cmd ));
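The two-level build can be exercised locally by letting sh -c stand in for each remote shell (my own sketch; sh -c performs the same word-splitting sshd's shell would):

```perl
sub text_to_shell_lit {
    return $_[0] if $_[0] =~ /^[a-zA-Z0-9_\-]+\z/;
    my $s = $_[0];
    $s =~ s/'/'\\''/g;
    return "'$s'";
}

# innermost command: print one argument containing spaces
my $very_remote_cmd = join ' ',
    map { text_to_shell_lit($_) } 'printf', '%s', 'a b c';

# wrap it once more, as the first ssh hop would require
my $remote_cmd = join ' ',
    map { text_to_shell_lit($_) } 'sh', '-c', $very_remote_cmd;

# running it applies two rounds of shell parsing, like ssh -> ssh
my $out = `$remote_cmd`;
die "double quoting failed" unless $out eq 'a b c';
print "double ok\n";
```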
    Re: safely passing args through ssh
    by JavaFan (Canon) on Jul 25, 2011 at 09:07 UTC
      Then running argcount 'a b c' returns 1, but running ssh localhost argcount 'a b c' returns 3. Moreover, running ssh localhost argcount 'a;b;c' exhibits even more undesirable behavior.
      This is more a shell question, than a Perl question. In fact, it's totally irrelevant in which language argcount is written.

      But here are some things you want to try:

      • ssh localhost argcount "'a b c'"
      • ssh localhost "argcount 'a b c'"
      • ssh localhost argcount \'a b c\'
      • ssh localhost argcount a\\\ b\\\ c
      • ssh localhost argcount "'"a b c"'"
    Re: safely passing args through ssh
    by Tanktalus (Canon) on Jul 25, 2011 at 16:05 UTC

      Here's what I do. This function doesn't really need to be an object/class method, but it does make it easier when it's in a base class to be able to call $self->quote_cmd from a derived object.

# assumed imports: any() and apply() from List::MoreUtils
use List::MoreUtils qw(any apply);

# shellish quote.
sub quote_cmd {
    my $self = shift;
    if (any { ref $_ eq 'ARRAY' } @_) {
        return join ' ; ', map { $self->quote_cmd(@$_) } @_;
    }
    # if we have only one parameter, assume it's a full command by
    # itself, already quoted (nothing else we can do anyway).
    return $_[0] if @_ == 1;
    join ' ', apply {
        if (not defined) {
            die("Undefined command parameter?");
        }
        elsif (0 == length) {
            $_ = q[''];
        }
        elsif (/[\s"]/ && !/['\$]/) {
            s[\\][\\\\]g;
            s['][\\']g;
            $_ = qq['$_'];
        }
        elsif (/['\$]/) {
            s[']['"'"']g;
            $_ = qq['$_'];
        }
    } @_;
}
As for double-ssh'ing, I do that too. I just quote the whole thing again.
# @options has options for ssh itself.
my @cmd = ( qw(ssh), @options, $self->usernode(), $self->quote_cmd(@cmd) );

# for testing purposes, allow proxying.
if ($ENV{SSH_TEST_PROXY}) {
    @cmd = ( qw(ssh), @options, $ENV{SSH_TEST_PROXY}, $self->quote_cmd(@cmd) );
}
      And then it's all taken care of. I haven't yet found a case where this doesn't work, though I haven't yet had a need to try more than two ssh calls.

      My unit tests involve a perl script that takes its args, converts to JSON, and prints it out. The .t file then gets the stdout, decodes the JSON, and compares it to what it sent in (think "is_deeply"). Spaces, quotes (both single and double), etc., all work fine.

      Note that my expectation is that the command and arguments are set up as if you were running system(@cmd), and not system($cmd). Of course, this does mean that my code doesn't really go for redirection (> /dev/null, 2>&1, etc.), but that's okay for me because I use IPC::Open3 and IO::Select to read all the output from the ssh calls on my (caller) end, and then I work with the output here. (If you try to pass in the redirection characters, this will kinda choke, but those characters could be added, and, again, it's not a concern here - this didn't go to CPAN.)

      My next project here, after a conversion to AnyEvent, is to create a wrapper script where I'll pass the parameters in a method independent of the shell, and have that script do the select bit, sending back all pertinent data via JSON (probably in chunks). The wrapper script will then be able to handle more than simple parameters (args:["a","b","c d e"]), but also set environment variables (env:{foo:"blah"},addenv:{PATH:"/some/extra/path"}), drop privileges if run as root (user:"nobody"), etc., almost like its own minilanguage :-)

    Re: safely passing args through ssh
    by salva (Abbot) on Jul 25, 2011 at 19:27 UTC
      Are there any other ways of solving this problem?

      Yes, use Net::OpenSSH!

      update (now that I have a real keyboard):

      Net::OpenSSH does the quoting for you transparently. For instance...

      $ssh->system('argcount', 'a;b;c');
      says 1.

If arguments are passed as scalar references, an alternative quoting mechanism that lets glob patterns pass unquoted is used:

      $ssh->system('ls', \'/home/jsmith/*.png');

      If you need to run your ssh commands through an intermediate machine, then you can...

      $ssh->system('ssh', $host, $ssh->shell_quote($cmd, @args));
Though you could probably bypass the intermediate host entirely and get a Net::OpenSSH object that runs commands directly on the remote side.
    Re: safely passing args through ssh
    by Tanktalus (Canon) on Jul 26, 2011 at 17:05 UTC

      Given your update, where you have a piece of code on machine A, and you want to execute it on machine B, my solution is to put the code in a standalone file, copy it to the remote machine, and then execute it as normal over ssh. In my case, I have a shared filesystem (machines B through Z all mount a filesystem off machine A), so this is a trivial File::Copy. However, my code actually also handles test scenarios where the shared filesystem isn't yet mounted, and starts by doing an scp first. Because I'm not sharing ssh connections, this does necessitate a second ssh connection, but in my case, that's pretty trivial, and since I now actually use sshfs to mount that shared filesystem when it isn't already mounted via nfs, I don't even need that much.

      Trying to copy arbitrary amounts of code across and execute the code all in a single go is going to be far from trivial. However, there's no reason that my previous solution wouldn't work if you formulate your code as a one-liner, e.g., if you set:

      @cmd = ( '/usr/bin/perl', -e => quote_cmd($script) );
      (don't put leading/trailing quotes on the script - the $script value should just be the code), this should be fine. However, this doesn't satisfy E(). You simply can't do it that way. You'll need to do:
@cmd = ('perl', '-e', $script);

# pick which one you need:
system(@cmd);
system('ssh', $host, quote_cmd(@cmd));
system('ssh', $proxy, quote_cmd('ssh', $host, quote_cmd(@cmd)));
# etc., etc.
      Basically, every time you add a new ssh layer, you'll have to re-quote_cmd it.

    Re: safely passing args through ssh
    by afoken (Monsignor) on Jul 27, 2011 at 13:40 UTC

      Let's see: ssh does not touch the STDIN, STDOUT, STDERR streams. But it invokes the remote standard shell to handle arguments. So, get rid of the shell parsing.

      Perl can read a program from STDIN, it stops reading at EOF or when it finds the magic "__END__" marker on a line. So, feed the remote Perl a program that implements some kind of read-eval-print loop (you may also call that a RPC server, if you wish), then send commands over STDIN, read results back from STDOUT.

      Working (stupid) example:

#!/usr/bin/perl -w
use strict;
use IPC::Open2; # or IPC::Open3

my ($in,$out);
$|=1;
my $pid=open2($out,$in,'/usr/bin/ssh','foken@','/usr/bin/perl');
print $in <<'__end_of_perl__';
$|=1;
use strict;
use warnings;
print "ok, running with PID $$\n";
while (<STDIN>) {
    chomp;
    print "File $_ has a size of ",-s($_)," bytes\n";
}
# the next line is important:
__END__
__end_of_perl__
print "Loaded\n";
print "Reading boot message: ";
my $info=<$out>;
print $info;
for my $file ('/etc/passwd','/etc/motd','/etc/issue') {
    print "Requesting $file\n";
    print $in "$file\n";
    print "Reading: ";
    $info=<$out>;
    print $info;
}


Loaded
Reading boot message: ok, running with PID 13498
Requesting /etc/passwd
Reading: File /etc/passwd has a size of 1468 bytes
Requesting /etc/motd
Reading: File /etc/motd has a size of 20 bytes
Requesting /etc/issue
Reading: File /etc/issue has a size of 24 bytes

      To do:

      • replace "foken@" with the username and hostname of the machine you want to talk with
      • replace the remote while-loop with the code you actually want to run
      • close the file handles when you are done (optional, but recommended)
      • add error checks (optional, but recommended)
      • replace the simple-and-stupid line-oriented protocol with some more advanced protocol (optional)
      • replace IPC::Open2 with IPC::Open3, open2() with open3(), and look for errors written to the remote STDERR
      • Encode data, e.g. using JSON, Data::Dumper, XML, YAML, whatever you like (optional)


      Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
        Yes - this will work. One issue is that you have to use buffered I/O on the remote side.

        If you are able to put the bootstrap program in the command line (via -e), then you can use unbuffered I/O on STDIN, i.e. sysread.

        Unless there is a way to safely use sysread(STDIN,...) after perl has read the server program up to __END__ ... (?)
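To illustrate the -e route with a local sketch (not from the post): when the bootstrap program arrives via -e, Perl never reads the program from STDIN, so sysread sees the data stream from its first byte:

```perl
# the child program comes in via -e, so its STDIN carries only data;
# sysread works unbuffered from the very first byte
my $out = `echo hello | perl -e 'sysread(STDIN, my \$buf, 64); print \$buf'`;
die "sysread demo failed" unless $out eq "hello\n";
print "sysread ok\n";
```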

    Node Type: perlquestion [id://916491]
    Approved by philipbailey
    Front-paged by Arunbear