
Re^2: REST API with SFTP Pooling

by Sukhster (Novice)
on Nov 14, 2020 at 05:44 UTC ( #11123640 )

in reply to Re: REST API with SFTP Pooling
in thread REST API with SFTP Pooling

Hi Salva,

Thanks for your response. SSH is not possible, only SFTP, and only one session per Account. Each Account has separate files and destinations, so you can't share one Account across ALL files.

Therefore, I need to keep the session alive while servicing requests, i.e. have a shared object between requests.

Re^3: REST API with SFTP Pooling
by salva (Canon) on Nov 14, 2020 at 06:38 UTC
    I didn't explain it clearly in my previous post.

    SSH is several things. Besides being a way to run a shell on a remote machine, it is also a transport protocol that can run several communication channels in parallel between the client and the server over a single SSH connection.

    When you use SFTP, first an SSH connection is established to the server, and then a channel running inside that connection is opened and attached to an SFTP server.

    In your particular setup, the remote SSH server is probably configured to accept only requests for SFTP channels.

    So, now, the remote server can be limiting the number of incoming connections in two ways: (1) limiting the number of SFTP channels per user, or (2) limiting the number of SSH connections per user.

    If it happens to be (2), then you can open an SSH connection to the server and run several SFTP sessions in parallel over that single connection.

    The interesting thing is that the OpenSSH ssh client has a mode of operation that makes it pretty easy to work that way: establishing a connection and then running sessions (including SFTP sessions) on top of it from other programs.

    The following script would tell you if you can actually run several SFTP sessions in parallel:

    use strict;
    use warnings;
    use Net::OpenSSH;
    use Net::SFTP::Foreign;

    my $ssh = Net::OpenSSH->new($host, user => $user, password => $password, ...);
    $ssh->error and die "can't connect to remote host: " . $ssh->error;

    my $sftp1 = $ssh->sftp or die "can't open SFTP channel 1: " . $ssh->error;
    my $sftp2 = $ssh->sftp or die "can't open SFTP channel 2: " . $ssh->error;

    print "I am running two SFTP sessions in parallel\n";

      Hi Salva,

      I can't thank you enough for your help.

      Thanks for the further explanation and the code sample for testing.

      I got "I am running two SFTP sessions in parallel" in the output from the code. So it looks like the server is allowing ONE SSH connection, but multiple SFTP sessions to be created.

      Therefore, how would I keep the SSH connection alive? What command could I run? Would keeping the SFTP sessions alive suffice, or should I periodically open a new SFTP connection? I am about to test the latter now.

      On a separate note, I have always used Net::SFTP::Foreign directly, and I know how to set options like:

      use Net::SFTP::Foreign;

      my @sftp_opts = ();
      push @sftp_opts, "-o";
      push @sftp_opts, "KexDHMin=1024";
      push @sftp_opts, "-o";
      push @sftp_opts, "KexAlgorithms=diffie-hellman-group14-sha1";
      . . .
      $sftp = Net::SFTP::Foreign->new(
          $config{'hostname'},
          user           => $config{'username'},
          port           => $config{'port'},
          stderr_discard => 1,
          autodie        => 0,
          key_path       => $config{'key'},
          more           => [ @sftp_opts ],
      );

      However, I cannot figure out how to set these for Net::OpenSSH; I tried the following to no avail:

      my @ssh_opts = ();
      push @ssh_opts, "-o";
      push @ssh_opts, "KexAlgorithms=diffie-hellman-group14-sha1";
      push @ssh_opts, "-o";
      push @ssh_opts, "KexDHMin=1024";

      my $ssh = Net::OpenSSH->new($config{'host'}, user => $config{'user'},
                                  port => $config{'port'}, key_path => $config{'key_path'},
                                  default_ssh_opts => [ @ssh_opts ]);
      # Returns: DH parameter offered by the server (1024 bits) is considered
      # insecure. You can lower the accepted minimum via the KexDHMin option.
      # DH_GEX group out of range: 2048 !< 1024 !< 8192

      my $ssh = Net::OpenSSH->new($config{'host'}, user => $config{'user'},
                                  port => $config{'port'}, key_path => $config{'key_path'},
                                  ssh_opts => [ @ssh_opts ]);
      # Returns: Invalid or bad combination of options ('ssh_opts')
        The way to pass extra options to the underlying ssh process from Net::OpenSSH is via the constructor's master_opts argument.
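        A minimal sketch of that, reusing the -o options from your snippet (the host and credential values are placeholders, and whether the server accepts those kex settings is an assumption):

        ```perl
        use strict;
        use warnings;
        use Net::OpenSSH;

        # Hypothetical connection details; substitute your own %config values.
        my %config = (host => 'sftp.example.com', user => 'someuser',
                      port => 22, key_path => '/path/to/key');

        # master_opts holds raw command-line options for the ssh process that
        # runs the multiplexing master, so -o settings belong here:
        my $ssh = Net::OpenSSH->new(
            $config{host},
            user        => $config{user},
            port        => $config{port},
            key_path    => $config{key_path},
            master_opts => [ -o => 'KexAlgorithms=diffie-hellman-group14-sha1' ],
        );
        $ssh->error and die "can't connect: " . $ssh->error;
        ```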

        The way to share the SSH connection is to pass around the path to the multiplexing socket (you can retrieve it by calling the get_ctl_path method).

        You can also specify where you want to create the multiplexing socket explicitly (instead of letting the module pick a place for you) with the constructor argument ctl_path. Net::OpenSSH, by default, enforces some security policies for that path that you should obey.
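        For illustration, a sketch of retrieving the socket path from the process that owns the master connection, so other processes can attach to it later (the host name is a placeholder):

        ```perl
        use strict;
        use warnings;
        use Net::OpenSSH;

        # One process establishes the master connection...
        my $ssh = Net::OpenSSH->new('sftp.example.com', user => 'someuser');
        $ssh->error and die "can't connect: " . $ssh->error;

        # ...and publishes the multiplexing socket path, e.g. by writing it
        # somewhere the worker processes can read it.
        my $ctl_path = $ssh->get_ctl_path;
        print "control socket at: $ctl_path\n";
        ```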

        So, for every connection you want to keep open (in your case, four connections, each using a different account), you need a process that establishes the connection and runs a dummy command from time to time. For instance:

        use Net::OpenSSH;

        my $account_id = 1;
        my $ctl_path = "/.../some/known/place/to/put/the/control/path-$account_id";

        while (1) {
            eval {
                unlink $ctl_path;
                my $ssh = Net::OpenSSH->new($host, ..., ctl_path => $ctl_path);
                $ssh->die_on_error("Unable to connect to remote host");
                while (1) {
                    # run the dummy command
                    $ssh->sftp->stat("/") or die "SFTP command failed";
                    sleep 10;
                }
            };
            warn $@ if $@;
            warn "delaying before restarting SSH connection\n";
            sleep 5;
        }
        Then from the process where you want to do the SFTP operation:
        ...
        my $ssh = Net::OpenSSH->new(external_master => 1, ctl_path => $ctl_path);
        my $sftp = $ssh->sftp // $ssh->die_on_error("Unable to create SFTP session");
        # at this point $sftp is a regular Net::SFTP::Foreign object you can
        # use in any way you like.
        ...

        Most servers limit the number of channels open simultaneously in a single SSH session. For instance, it is 10 by default for OpenSSH. If you expect very many requests coming in parallel, that could be a source of problems.
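        For reference, on OpenSSH servers that per-connection limit is the MaxSessions directive in sshd_config (server side, so only relevant if you control the server):

        ```
        # /etc/ssh/sshd_config (server side)
        # Maximum number of open shell/SFTP sessions per network connection.
        # OpenSSH's default is 10.
        MaxSessions 10
        ```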

        Finally, note that allowing direct connections between a web server and other services has some security implications. If the front end becomes compromised, intruders would be able to freely access those backend services and the data there. Because of that, sometimes using an intermediate layer such as the queue mechanism proposed in my OP could still be the most sensible thing to do.

        Update: Both Net::OpenSSH and Net::SFTP::Foreign allow you to set timeouts for connecting and/or running commands. You should adjust those to ensure that if any connection stalls, it is discarded and re-established, so that the service recovers promptly.
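        A hedged sketch of setting those timeouts (values in seconds; the host name and 30-second choice are placeholders to adapt to your network):

        ```perl
        use strict;
        use warnings;
        use Net::OpenSSH;

        # timeout bounds connection establishment and individual operations
        # performed through this object.
        my $ssh = Net::OpenSSH->new('sftp.example.com',
                                    user    => 'someuser',
                                    timeout => 30);
        $ssh->error and die "connection failed or timed out: " . $ssh->error;

        # The Net::SFTP::Foreign object created from it accepts its own
        # timeout option as well, passed through the sftp method:
        my $sftp = $ssh->sftp(timeout => 30)
            or die "can't open SFTP session: " . $ssh->error;
        ```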
