Re^7: Best technique to code/decode binary data for inter-machine communication?
by BrowserUk (Pope)
on Aug 16, 2012 at 18:20 UTC
I like that shorthand format; N/(N/a*)* and N/(n/a*)* look very flexible for future encoding/decoding uses.
The spec says '22 bits', but the Perl code seems to use '24 bits' for each character, with the high-order bits being '00'.
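For anyone following along, here's a minimal sketch of the counted-string idea behind those templates, using the plain N/a* form (the grouped variants above extend the same mechanism):

```perl
use strict;
use warnings;

# Pack a string with a 4-byte big-endian length prefix (N), then the bytes (a*).
my $packed = pack 'N/a*', 'hello';
printf "packed: %v02X\n", $packed;    # 00.00.00.05.68.65.6C.6C.6F

# Unpack reads the N count first, then exactly that many bytes.
my ($str) = unpack 'N/a*', $packed;
print "unpacked: $str\n";             # hello
```

The receiver never has to guess where the string ends; the length travels with the data, which is what makes the nested grouped forms so convenient for wire formats.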
UTF-8 is a variable-width encoding: each character can require from 1 to 4(*) bytes.
(*Or 6 bytes depending upon the wind direction and the phases of the moon(**).)
<smaller>(**which moon is left unspecified :)</smaller>
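A quick way to see the variable width for yourself, using the core Encode module:

```perl
use strict;
use warnings;
use Encode qw(encode);

# Encode one character at each width and show how many bytes UTF-8 needs.
for my $ch ( "A", "\x{E9}", "\x{20AC}", "\x{1F600}" ) {
    my $bytes = encode( 'UTF-8', $ch );
    printf "U+%04X -> %d byte(s)\n", ord($ch), length $bytes;
}
# U+0041 -> 1 byte(s)
# U+00E9 -> 2 byte(s)
# U+20AC -> 3 byte(s)
# U+1F600 -> 4 byte(s)
```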
With the rise and rise of 'Social' network sites: 'Computers are making people easier to use every day'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.