PerlMonks  

Re: Problem getting a large compressed tar file over ssh.

by Corion (Patriarch)
on Nov 30, 2005 at 08:13 UTC


in reply to Problem getting a large compressed tar file over ssh.

You have an error in your script: the command you're sending to the remote machine is not interpolated, so $rmt_dir never gets expanded into the string. You likely want:

$cmd="tar -czf - $rmt_dir";
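To illustrate the difference (the directory below is just a made-up example):

my $rmt_dir = '/some/remote/dir';      # made-up example path
my $single  = 'tar -czf - $rmt_dir';   # single quotes: Perl leaves $rmt_dir alone
my $double  = "tar -czf - $rmt_dir";   # double quotes: becomes "tar -czf - /some/remote/dir"
print "$single\n$double\n";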

Depending on how dire your error handling situation is, avoiding Perl might be the most convenient way to resolve your memory issues:

#!/usr/bin/sh
# Note: ssh's -c option selects a cipher; the command to run goes after the host,
# so a (placeholder) host name is needed here.
ssh user@remotehost "tar -czf - $rmt_dir" > backup.tar.gz

This will run the tar command remotely and send the created .tar.gz file directly to STDOUT; on the local end the output is written straight into a file instead of being buffered in memory. You will still need some error checking afterwards, though.
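A rough sketch of what that error checking could look like (the host name is a placeholder): check ssh's exit status, which carries the remote command's status, and verify that the archive is non-empty and decompresses cleanly:

#!/usr/bin/sh
ssh user@remotehost "tar -czf - $rmt_dir" > backup.tar.gz
status=$?
if [ $status -ne 0 ]; then
    echo "remote tar / ssh failed with exit code $status" >&2
    exit $status
fi
# sanity check: the archive exists, is non-empty, and gzip can read it end to end
if [ ! -s backup.tar.gz ] || ! gzip -t backup.tar.gz; then
    echo "backup.tar.gz is missing, empty, or corrupt" >&2
    exit 1
fi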

Re^2: Problem getting a large compressed tar file over ssh.
by swares (Monk) on Nov 30, 2005 at 09:09 UTC
    I used to do this as you suggested, but that method requires a trusted-host type relationship, which I do not have in this environment. I have to connect via ssh to a number of different systems using username/password authentication.

      You don't really need a trusted host relationship; you just need passwordless keys. If passwordless keys are impossible, then you will have to use the interactive method. Maybe you can go further by hacking Net::SSH::Perl to redirect the STDOUT part of the connection. Looking into the source of Net::SSH::Perl::SSH1, there seem to be handlers like SSH_SMSG_STDOUT_DATA, and replacing the default handler with something that doesn't accumulate the string might help:

      # Original code
      sub cmd {
          ...
          unless ($ssh->handler_for(SSH_SMSG_STDOUT_DATA)) {
              $ssh->register_handler(SSH_SMSG_STDOUT_DATA,
                  sub { $ssh->{_cmd_stdout} .= $_[1]->get_str });
          }
          ...
      }

      I would try to supply my own callback like this:

      my $ssh = Net::SSH::Perl->new(...);
      open my $outfh, ">", $filename
          or die "Couldn't create '$filename': $!";
      binmode $outfh;
      $ssh->register_handler('stdout', sub {
          print $outfh $_[1];
      });
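
      Untested, but roughly how I would wire that together (the host, credentials, directory and file name below are placeholders, not anything from your setup):

      # rough, untested sketch -- everything quoted here is a placeholder
      use Net::SSH::Perl;

      my $filename = 'backup.tar.gz';
      my $rmt_dir  = '/some/remote/dir';

      my $ssh = Net::SSH::Perl->new('remotehost');
      $ssh->login('username', 'password');

      # install our handler before cmd() so the default accumulating one never kicks in
      open my $outfh, ">", $filename or die "Couldn't create '$filename': $!";
      binmode $outfh;
      $ssh->register_handler('stdout', sub { print $outfh $_[1] });

      $ssh->cmd("tar -czf - $rmt_dir");
      close $outfh or die "Couldn't close '$filename': $!";
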
        If passwordless keys are at all possible, you could also consider File::Remote.

        This is from the module's example:

        use File::Remote qw(:replace);

        # read from a remote file
        open(REMOTE, "host:/remote/file") or die $!;
        print while (<REMOTE>);
        close(REMOTE);

        # writing a local file still works!
        open(LOCAL, ">>/local/file");
        print LOCAL "This is a new line.\n";
        close(LOCAL);

        # Read and write whole files (the method interface needs an object)
        my $remote = File::Remote->new;
        my @file = $remote->readfile("host:/remote/file");
        $remote->writefile("/local/file", @file);
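
        Purely as an untested sketch of how that might apply to the original problem (the host and file names are made up), the remote archive could be pulled down through the replaced open(). Note that File::Remote shells out to rsh/rcp by default, so it would have to be pointed at ssh/scp for this environment.

        # untested sketch; host and file names are made up
        use File::Remote qw(:replace);

        open(REMOTE, "host:/remote/backup.tar.gz") or die $!;
        open(LOCAL, ">backup.tar.gz") or die $!;
        binmode REMOTE;
        binmode LOCAL;
        print LOCAL $_ while <REMOTE>;
        close(REMOTE);
        close(LOCAL);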

        0xbeef

Re^2: Problem getting a large compressed tar file over ssh.
by science_gone_bad (Beadle) on Nov 30, 2005 at 19:40 UTC
    I was following this thread and never saw anything about the GNU tar side of it.

    I've had issues in the past where GNU tar chokes if the tar file it creates exceeds 2GB. I don't know if this has been fixed, but I was still seeing it as late as September 2004.
    Maybe...
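
    If a 2GB limit is a worry on either end, one untested workaround (host, directory and chunk size are placeholders) is to split the stream on the receiving side so no single local file ever crosses the limit; streaming tar to a pipe already avoids creating a huge archive file on the remote side:

    #!/usr/bin/sh
    # untested sketch; host, directory and chunk size are placeholders
    # pieces come out as backup.tar.gz.aa, backup.tar.gz.ab, ...
    ssh user@remotehost "tar -czf - /remote/dir" | split -b 1024m - backup.tar.gz.
    # restore later with: cat backup.tar.gz.* | tar -xzf -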
