http://www.perlmonks.org?node_id=11123604

Sukhster has asked for the wisdom of the Perl Monks concerning the following question:

Hi All,

<< First Time Post >>

I have been using Perl for some years, mostly for Regex/DBI work, and would like to use it to provide a solution for sending files over SFTP via a REST API. The reason is that we need to send a couple of thousand files to a remote server, which has the following restriction on the SFTP connection: only one session is allowed per Account.

The team is working on a Java Spring solution to the above, but it is taking ages, and I would like to ensure that we have a solution; Perl has always had a faster dev curve for me.

I believe that I need the following:

  1. A process which listens on a socket and forks/creates a thread for each request
  2. A pool of SFTP connections, with Account ID as the primary key for each
  3. A way to share this pool amongst the other processes
  4. When the SFTP connection is returned to the pool:
    1. run "version" or "ls" against it every 25 seconds
    2. wrap the SFTP connection in an object which has an is_alive function (sketched below)
  5. Return success/failure after each request

I can see quite a few server libraries which do 1 and 5 above, but I don't understand how I could share an object (the pool of SFTP connections) between them, and, more importantly, how I can ensure that there is a thread/while-loop keeping each connection alive.
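
Roughly, this is the kind of wrapper I have in mind (pure pseudocode on my part; all the names are invented, and Net::SFTP::Foreign is just the client module I have been looking at):

  package SFTPHandle;
  use strict;
  use warnings;
  use Net::SFTP::Foreign;

  sub new {
      my ($class, %args) = @_;
      my $sftp = Net::SFTP::Foreign->new($args{host},
                                         user     => $args{user},
                                         password => $args{password});
      die "connect failed: " . $sftp->error if $sftp->error;
      bless { sftp => $sftp, account_id => $args{account_id} }, $class;
  }

  # is_alive: send a cheap request; if it errors, the session has died
  sub is_alive {
      my ($self) = @_;
      $self->{sftp}->ls('.');
      return !$self->{sftp}->error;
  }

  1;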

If libraries/approaches can be suggested, with code samples or pseudocode, that would really help get me started.

Help, Ye Old Perl Monks of Lore!

Replies are listed 'Best First'.
Re: REST API with SFTP Pooling
by salva (Canon) on Nov 12, 2020 at 11:20 UTC
    Sharing SFTP connections between processes or threads is not a good idea: too complicated and too difficult to get right*.

    A simpler approach would be to have, say, five workers, each with a dedicated SFTP connection, listening for requests on the same socket/pipe/queue/whatever. When a new request arrives, the first one able to catch it handles it.

    In the part of the code where a worker waits for new requests, a timeout can be set so that the dummy command is sent if nothing happens for a while.
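
    For instance, something along these lines (just a sketch: the request-reading helpers and the placeholder credentials are made up, and Net::SFTP::Foreign is simply one SFTP client that works well here):

    use strict;
    use warnings;
    use IO::Select;
    use Net::SFTP::Foreign;

    my ($host, $user, $password) = ('sftp.example.com', 'acct1', 'secret');  # placeholders
    my $request_fh = \*STDIN;    # stand-in for the shared socket/pipe/queue

    # One worker, one dedicated SFTP connection.
    my $sftp = Net::SFTP::Foreign->new($host, user => $user, password => $password);
    $sftp->error and die "can't connect: " . $sftp->error;

    my $sel = IO::Select->new($request_fh);
    while (1) {
        if ($sel->can_read(25)) {                  # wait up to 25 seconds for work
            my $req = read_request($request_fh);
            handle_request($sftp, $req);
        }
        else {
            $sftp->ls('.');                        # dummy command: keep the session alive
            $sftp->error and die "connection lost: " . $sftp->error;
        }
    }

    # Made-up helpers: parse one request off the shared handle and do the transfer.
    sub read_request   { my ($fh) = @_; my $line = <$fh>; chomp $line if defined $line; $line }
    sub handle_request { my ($sftp, $file) = @_; $sftp->put($file, $file) or warn "put failed: " . $sftp->error }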

    *) Well, unless the limit is not on SFTP sessions but on SSH connections. It is pretty easy to reuse SSH connections with something like Net::OpenSSH, and then run SFTP on top of them.

      My initial thoughts were similar. Were I to do something like this, I'd use it as an excuse to play with Mojolicious and its job queue, Minion. I'd make a worker task which would manage the SFTP connection, and you could have your Mojo app implementing the REST API. The worker task would enqueue a keepalive request to itself at whatever interval (and, to be fancy, if it gets real work before then it could cancel the prior keepalive and enqueue a new one).
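
      The handwaving in code form (an untested sketch: the task names and the sftp_for helper are my inventions, and note that each Minion worker process would end up holding its own connection):

      use Mojolicious::Lite -signatures;

      plugin Minion => {SQLite => 'sqlite:minion.db'};

      # Made-up helper: look up (or lazily create) this account's SFTP connection.
      sub sftp_for ($account) { ... }

      app->minion->add_task(keepalive => sub ($job, $account) {
          sftp_for($account)->ls('.');    # cheap request keeps the session alive
          # queue the next keepalive tick
          $job->app->minion->enqueue(keepalive => [$account], {delay => 25});
      });

      app->minion->add_task(send_file => sub ($job, $account, $local, $remote) {
          my $sftp = sftp_for($account);
          $sftp->put($local, $remote) or die "put failed: " . $sftp->error;
      });

      # REST endpoint: enqueue the transfer and hand back the job id
      post '/send/:account' => sub ($c) {
          my $id = $c->minion->enqueue(send_file =>
              [$c->param('account'), $c->param('local'), $c->param('remote')]);
          $c->render(json => {job => $id});
      };

      app->start;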

      </slightly specific handwaving>

      The cake is a lie.
      The cake is a lie.
      The cake is a lie.

        Indeed, doing that with Minion seems pretty easy.

        In order to keep the SFTP connection alive, you can just set a timer (with Mojo::IOLoop's timer or recurring) for sending the dummy commands.
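
        Something like this, assuming $sftp already holds the Net::SFTP::Foreign connection (recurring is the repeating flavour of timer):

        use Mojo::IOLoop;

        # assumes: my $sftp = Net::SFTP::Foreign->new(...) established earlier
        # Every 25 seconds, while the loop is running, send one cheap request.
        Mojo::IOLoop->recurring(25 => sub {
            $sftp->ls('.');
            warn "keepalive failed: " . $sftp->error if $sftp->error;
        });

        Mojo::IOLoop->start unless Mojo::IOLoop->is_running;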

      Hi Salva,

      Thanks for your response. SSH is not possible, only SFTP, and only one session per Account. Each Account has separate files and destinations, so you can't share one Account across ALL the files.

      Therefore, I need to keep each session alive while servicing requests, i.e. have a shared object between requests.

        I didn't explain myself clearly in my previous post.

        SSH is several things. Besides being a way to run a shell on a remote machine, it is also a transport protocol that can run several communication channels in parallel between the client and the server over a single SSH connection.

        When you use SFTP, first an SSH connection is established to the server, and then a channel running inside that connection is opened and attached to an SFTP server.

        In your particular setup, the remote SSH server is probably configured to accept only requests for SFTP channels.

        So the remote server can be limiting the number of incoming connections in two ways: (1) limiting the number of SFTP channels per user, or (2) limiting the number of SSH connections per user.

        If it happens to be (2), then you can open a single SSH connection to the server and run several SFTP sessions in parallel over it.

        The interesting thing is that the OpenSSH ssh client has a mode of operation (connection multiplexing, which Net::OpenSSH uses) that makes it pretty easy to work that way: establish a connection once, then run sessions (including SFTP sessions) on top of it from other programs.

        The following script would tell you if you can actually run several SFTP sessions in parallel:

        use strict;
        use warnings;
        use Net::OpenSSH;
        use Net::SFTP::Foreign;

        my $ssh = Net::OpenSSH->new($host, user => $user, password => $password, ...);
        $ssh->error and die "can't connect to remote host: " . $ssh->error;

        my $sftp1 = $ssh->sftp or die "can't open SFTP channel 1: " . $ssh->error;
        my $sftp2 = $ssh->sftp or die "can't open SFTP channel 2: " . $ssh->error;

        print "I am running two SFTP sessions in parallel\n";
Re: REST API with SFTP Pooling
by Sukhster (Novice) on Nov 14, 2020 at 06:03 UTC

    Hi Fletch,

    Many thanks for the suggestion to use Mojolicious. I had never come across it, but it looks to be what I need, especially with the addition of Minion (and salva's suggested keepalive method).

    Currently looking through the Mojolicious tutorial (had to remove the link because it was stopping this post), and . . . figuring out how to start :)