PerlMonks  

Re^4: bcvi - run vi over a 'back-channel'

by grantm (Parson)
on Feb 06, 2008 at 00:08 UTC ( [id://666424] )


in reply to Re^3: bcvi - run vi over a 'back-channel'
in thread bcvi - run vi over a 'back-channel'

You certainly don't need to do xhost +. That would only help if gvim were running on the remote server and connecting back to X on your workstation. The whole point of bcvi is to avoid forwarding X (and requiring gvim and X libraries on the server). Instead, bcvi uses a forwarded socket to pass one message back to your workstation, which then invokes gvim with an scp-style file path.
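To make that mechanism concrete, here is a toy simulation of the back-channel in shell. It assumes nothing about bcvi's real protocol: a FIFO stands in for the forwarded socket, the hostname and file path are made-up examples, and the "workstation" side only prints the gvim command it would run rather than launching an editor.

```shell
# FIFO standing in for the forwarded socket (bcvi really uses TCP)
fifo=$(mktemp -u)
mkfifo "$fifo"

# "server" side: send one message (host and path) over the back-channel
echo "servername /etc/motd" > "$fifo" &

# "workstation" side: receive the message and build the scp-style path
read -r host file < "$fifo"
cmd="gvim scp://$host/$file"
echo "$cmd"    # prints: gvim scp://servername//etc/motd

rm -f "$fifo"
```

Note the double slash in the result: the first slash separates host from path, the second is the leading slash of the absolute path on the server.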

If you type bcvi filename on the server and get Can't connect to 'localhost:10019', that suggests the SSH daemon on the server has been configured to refuse TCP forwarding requests. You can find out for sure by running a command like this to connect to the server:

ssh -v -R 10019:localhost:10019 servername
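The -R argument reads as listen_port:host:hostport: listen on port 10019 on the server, and forward anything that connects to it back to port 10019 on your workstation. If you want the forward on every connection rather than on the command line each time, the equivalent can live in your ~/.ssh/config (a sketch using the port from the error above; bcvi's own wrapper may already arrange this for you):

```
Host servername
    RemoteForward 10019 localhost:10019
```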

The '-v' gives you a lot of debugging output, which ought to include a line like this:

debug1: remote forward success for: listen 10019, connect localhost:10019

but in your case there may be an error indicating that the 'remote forward' was not successful. The fix would be to make sure the sshd_config on the server includes:

AllowTcpForwarding yes
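A quick way to see what the config currently says is to grep for the directive. The sketch below runs against a sample file so it is self-contained; on the real server you would point it at /etc/ssh/sshd_config instead. If the line is absent entirely, the compiled-in default applies, which is "yes" for stock OpenSSH.

```shell
# Sample config standing in for /etc/ssh/sshd_config
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
Port 22
AllowTcpForwarding yes
EOF

# Pull out the (case-insensitive) AllowTcpForwarding directive, if any
setting=$(grep -Ei '^[[:space:]]*AllowTcpForwarding' "$cfg")
echo "$setting"    # prints: AllowTcpForwarding yes

rm -f "$cfg"
```

Remember that after changing sshd_config you need to reload sshd for the setting to take effect (how depends on the system, e.g. systemctl reload sshd).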

Even if you can't make bcvi work, you should still be able to edit a remote file like this:

gvim scp://server//path/to/some/file
