Re^4: Best technique to code/decode binary data for inter-machine communication?
by flexvault (Prior) on Aug 16, 2012 at 13:13 UTC
First, I'm interested in the code sample you gave:
Currently I write that as:
Is your code a shorthand for the above?
Second, as you and others have pointed out, I did not use 'binmode' after opening the socket. If I were to add the following:
To both the client and server code, would I be in 'binary' mode on Windows, *nix, etc., or would I need different client code for each? Reading the latest 'binmode' documentation, it sounds like the function would be ignored on some systems and used where binary and text definitions differ.
Third, 'Storable' does not produce 'network neutral' results, so it can't be used in this case.
Fourth, if someone passes a ':utf8' key/value pair to my application and I store the variables in an external file as ':raw', will they be able to use the data as UTF-8 when they receive the key/value pair back? Until I read the 'binmode' documentation, I didn't think of that possibility!
"Well done is better than well said." - Benjamin Franklin