Mixing sysread() with <FILEHANDLE>?
by wanna_code_perl (Pilgrim) on May 26, 2008 at 18:30 UTC
wanna_code_perl has asked for the wisdom of the Perl Monks concerning the following question:
Hello Perl experts. I'm writing a Perl client/server program that operates over a remote shell (SSH) with a simple line-based text protocol. However, I also need to transmit blobs of binary data.
The client uses open2(*Reader, *Writer, "ssh ... my_server") to get a Reader and a Writer filehandle to the remote server.
The server simply reads <STDIN> and writes to STDOUT.
This setup works fine for the text-based parts of the protocol.
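For reference, the client-side setup sketched above looks roughly like this (a runnable sketch: 'cat' stands in here for the real "ssh ... my_server" command, and the handle names are my own):

```perl
use strict;
use warnings;
use IPC::Open2;

# In the real program the command would be "ssh ... my_server";
# 'cat' echoes everything back, so the sketch runs locally.
my $pid = open2(my $reader, my $writer, 'cat');
# Note: open2() turns autoflush on for $writer automatically.

print {$writer} "----- Chunk of length 5 -----\n";   # text part of the protocol
my $line = <$reader>;                                 # line echoed back by 'cat'
print $line;

close $writer;
waitpid $pid, 0;
```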
I am not sure of the best way to do the binary transfer from client to server. I tried using syswrite() from the client and sysread() on the server, (basically) like this:
Client: sends the text "----- Chunk of length $len -----\n"
Server: receives the above line OK, parses it correctly, and calls sysread(STDIN, $buf, $len)
Client: sends exactly $len bytes with syswrite(Writer, $data, $len)
Server: sysread() blocks indefinitely
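The failing sequence can be reproduced in a single process with a plain pipe, which also shows why sysread() blocks (the payload bytes and length are made up for the demo):

```perl
use strict;
use warnings;

pipe(my $reader, my $writer) or die "pipe: $!";
syswrite($writer, "----- Chunk of length 5 -----\n" . "\x01\x02\x03\x04\x05");
close $writer;   # in the real program the writer stays open, so sysread blocks

my $header = <$reader>;   # buffered readline slurps *all* available bytes,
                          # including the 5 binary ones, into the PerlIO buffer
my $n = sysread($reader, my $buf, 5);   # bypasses that buffer, reads the raw fd
print "sysread returned: ", defined $n ? $n : 'undef', "\n";   # 0 (EOF): the
# binary bytes are stranded in the buffer that sysread never looks at
```

With the writer still open, sysread() simply blocks waiting on the fd for bytes that have already been read into the buffer.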
If I send an extra "\n" after my chunk of data, sysread() unblocks, but I am concerned that if my binary data contains newlines, this may cause other problems.
I realise it is ill-advised to mix sysread()/syswrite() with other (buffered) IO types, and that this is part of the problem. Any suggestions on either how to make this work, or a better design?
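One possible fix (a sketch, not the only design): have the server stay on the buffered stream throughout, using read() instead of sysread(). read() shares the PerlIO buffer with readline, so the binary bytes that readline pre-fetched are not lost. The header regex and the "handle $data" parts below are assumptions standing in for the real protocol handling:

```perl
use strict;
use warnings;

binmode STDIN;   # no newline/encoding translation on the binary chunk

while (my $line = <STDIN>) {
    if ($line =~ /^----- Chunk of length (\d+) -----$/) {
        my $len  = $1;
        my $data = '';
        # read() may return short counts; loop until the whole chunk arrives
        while ($len > 0) {
            my $n = read(STDIN, my $part, $len);
            die "unexpected EOF mid-chunk" unless $n;
            $data .= $part;
            $len  -= $n;
        }
        # ... handle $data ...
    }
    else {
        # ... normal line-based protocol handling ...
    }
}
```

On the client side the same principle applies in reverse: either stick to print/syswrite consistently on Writer, or rely on open2() having turned autoflush on so buffered writes go out immediately.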