PerlMonks
Forking server for ssh tunnels

by tinypig (Beadle)
on Feb 25, 2006 at 16:45 UTC ( id://532780 )

tinypig has asked for the wisdom of the Perl Monks concerning the following question:

We have a server that is written in Perl and uses IO::Socket. When it receives a connection, it does a little bit of work and then if all goes well, it forks and execs the appropriate ssh tunnel based on information passed during the initial connection. It forks because it hangs around to monitor the ssh tunnel.
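A minimal sketch of the fork-and-exec step described above (the accept loop and the negotiation over the IO::Socket connection are elided, and a harmless stand-in command replaces the real ssh invocation):

```perl
use strict;
use warnings;

# Hypothetical sketch of the fork-and-exec step described above.
# In the real server, @cmd would be the ssh command line negotiated
# with the client over the IO::Socket connection.
sub spawn_tunnel {
    my @cmd = @_;
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        exec @cmd or die "exec @cmd failed: $!";  # child becomes the tunnel
    }
    return $pid;    # parent keeps this pid around to monitor the tunnel
}

# Demo with a harmless stand-in command instead of ssh:
my $pid = spawn_tunnel($^X, '-e', 'exit 7');
waitpid $pid, 0;
my $status = $? >> 8;
print "child exited with status $status\n";
```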

Here are the problems: this particular perl binary takes up a great deal of memory, and as time goes on, more and more instances accumulate due to the forking. I also don't think it's cleaning up the connections properly.

The bottom line is this: I can't help thinking that we are doing this on too low a level and I am considering re-writing it. Is this something that would be good to use POE for?

Replies are listed 'Best First'.
Re: Forking server for ssh tunnels
by zentara (Archbishop) on Feb 25, 2006 at 20:15 UTC
    Well, your question is sort of a "looking for ideas" one. If your ssh tunnels are not passing huge amounts of data, you could try running a collection of them with IO::Select, which will jump from filehandle to filehandle and service each as needed. But SSH is quite complicated, and I don't know whether there may be some glitch in running separate SSH instances under one interpreter; forking them is definitely safer. Threads would be another possibility, but they use a lot of RAM too. Why not just spend $100 for another gig of RAM? :-) It will probably be cheaper than all the time needed to write and test reliable scripts.
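    A toy illustration of the IO::Select approach, with two pipes standing in for the tunnel filehandles (no ssh involved; this only shows the jump-from-handle-to-handle mechanics):

```perl
use strict;
use warnings;
use IO::Select;

# Two self-pipes stand in for tunnel filehandles; in a real server
# these would be handles attached to the ssh processes.
pipe(my $r1, my $w1) or die "pipe: $!";
pipe(my $r2, my $w2) or die "pipe: $!";
print $w1 "from tunnel one\n"; close $w1;
print $w2 "from tunnel two\n"; close $w2;

my $sel = IO::Select->new($r1, $r2);
my @lines;
while ($sel->count) {
    for my $fh ($sel->can_read) {     # blocks until some handle is ready
        my $line = <$fh>;
        if (defined $line) { push @lines, $line }
        else               { $sel->remove($fh); close $fh }   # EOF: drop it
    }
}
print @lines;
```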

    I'm not really a human, but I play one on earth. flash japh
      The SSH Perl implementation Net::SSH::Perl has some support for non-blocking operation but not enough to be run inside a select loop.

      I think the best option would be to create the tunnels with IPC::Open2 and to use a unique perl process written around a select loop to control them and listen for new connections.

      ... though it's not clear to me what kind of "ssh tunnels" the OP means: using the stdin and stdout of the ssh process to tunnel data (as in tar cf - . | ssh foo tar xf -), or using ssh's native support for tunnels (i.e. ssh foo -L1234:host:1234).
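      A sketch of the open2-plus-select idea, with a trivial perl echo one-liner standing in for the ssh command (the stand-in command and the "ping" exchange are illustrative assumptions, not the OP's protocol):

```perl
use strict;
use warnings;
use IPC::Open2;
use IO::Select;

# Spawn the tunnel with open2 so the controlling process holds both
# its stdin and its stdout, then watch the read side with IO::Select.
# A perl echo one-liner stands in for the real ssh command.
my $pid = open2(my $from_tunnel, my $to_tunnel,
                $^X, '-pe', '1');    # stand-in for: ssh host -L ...
print $to_tunnel "ping\n";
close $to_tunnel;                    # a real server would keep this open

my $sel = IO::Select->new($from_tunnel);
my $reply;
if ($sel->can_read(5)) {             # wait up to 5s for tunnel output
    $reply = <$from_tunnel>;
}
close $from_tunnel;
waitpid $pid, 0;
print $reply if defined $reply;
```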

        Yes, I think what you're suggesting sounds like what we should have been doing in the first place. I will look along those lines.

        As to the hows, we're doing:

        my $pid = open $f, "-|" or exec @cmd;  # @cmd holds the ssh path and the arguments for creating an ongoing tunnel

        Still not sure if I'm giving enough information, but you guys have helped put me on a better track, I think.
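        On the cleanup point from the original question: if the parent never reaps its exited ssh children, they linger as zombies, which may be part of the "not cleaning up properly" symptom. A standard SIGCHLD reaper (sketched here with harmless stand-in children instead of ssh) avoids that:

```perl
use strict;
use warnings;
use POSIX ':sys_wait_h';

# Reap exited children as soon as they die, so no zombies accumulate.
my %kids;
$SIG{CHLD} = sub {
    while ((my $pid = waitpid(-1, WNOHANG)) > 0) {
        delete $kids{$pid};          # tunnel $pid has exited
    }
};

for (1 .. 3) {                       # stand-ins for ssh tunnel children
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) { exec $^X, '-e', 'exit 0' or die "exec: $!" }
    $kids{$pid} = 1;
}
sleep 1 while %kids;                 # wait until every child is reaped
print "all children reaped\n";
```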

Re: Forking server for ssh tunnels
by zentara (Archbishop) on Feb 26, 2006 at 19:34 UTC
    I was just building perl 5.8.8 today and realized someone should have mentioned that you can build perl with a shared libperl.so, which would significantly reduce the memory usage. My (typical) statically linked perl is a little over 1 MB. If you build with a shared libperl, each perl executable will probably be under 100k, with all instances sharing the one libperl.so.

    When you build Perl, you are asked whether you want to do this. That said, salva's advice about examining how you are opening ssh is probably the more important point.
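    For reference, the non-interactive way to request this is the -Duseshrplib Configure flag (the prefix path here is just an example):

```shell
# Build perl with a shared libperl.so instead of a static libperl.a
sh Configure -des -Duseshrplib -Dprefix=/opt/perl
make && make test && make install
```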

      Ah, that is fantastic! I had forgotten that was one of the things I was meaning to look into. Based on what I'm seeing, I do not think we have it configured that way. Thanks for the suggestion. ++
      Well, not really: modern operating systems do not clone the full memory image when forking but use a copy-on-write mechanism, so the forked processes share the perl binary code and most of the heap data anyway.

      I think that even if you run the same program several times, the OS will not use new memory for the additional copies, because it mmaps the executable.

      Compiling perl as a dynamic library is only useful if you are building several different executables from it, for instance perl and Apache with mod_perl.

Node Type: perlquestion [id://532780]
Approved by Arunbear