http://www.perlmonks.org?node_id=335194
bear0053 has asked for the wisdom of the Perl Monks concerning the following question:

Fellow monks, I have come across a situation that baffled me. I posted this question a little while ago and only got a few responses; hopefully this post will help me determine a solution.

I have an encryption object running under Win32 that returns binary data as 7-bit ASCII (verified in .NET). When Perl gets the property from the object and puts the value into a scalar, it converts it to 7-bit ASCII with "even" parity. When .NET applications read this data from the encryption object, it is simply read in as 7-bit ASCII. Why is this happening, and what can I do to force it in as plain 7-bit ASCII?

EXAMPLE:
In .NET, the C# object returns an EOT as hex 04, i.e. plain 7-bit ASCII.

In Perl, the same object returns the EOT as hex 84, i.e. 7-bit ASCII with even parity.
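To illustrate the relationship between those two values: 0x04 has an odd number of one bits, so an even-parity scheme sets the high (8th) bit to make the total bit count even, which yields exactly 0x84. A quick sketch in Perl (the byte value is just the EOT from this example):

```perl
# Even parity: set the top bit when the low 7 bits contain an
# odd number of ones, so every byte carries an even bit count.
my $byte = 0x04;                    # EOT, one bit set (odd)
my $bits = sprintf('%b', $byte);    # binary digits as a string
my $ones = ($bits =~ tr/1//);       # count the one bits
my $with_parity = ($ones % 2) ? ($byte | 0x80) : $byte;
printf "0x%02X\n", $with_parity;    # prints 0x84
```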

WHY?
Can it be forced to plain 7-bit ASCII, or can I easily convert the 7-bit ASCII even parity back to 7-bit ASCII?

The convert route is not optimal for performance; the CPU is already hit by the automatic conversion to 7-bit ASCII even parity.
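For what it's worth, if the only damage really is a parity bit set in the high bit of each byte, stripping it on the Perl side is a single tr/// pass over the scalar, which should be cheap even for large values. A sketch, assuming sample bytes in the even-parity form described above:

```perl
# Clear the high (parity) bit of every byte, mapping e.g. 0x84
# back to 0x04. Both tr/// ranges cover 128 characters, so each
# high-bit byte is shifted down by 0x80 in a single pass.
my $enc_data = "\x84\xC1\xC2";          # sample "even parity" bytes
$enc_data =~ tr/\x80-\xff/\x00-\x7f/;
print unpack('H*', $enc_data), "\n";    # prints 044142
```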

I am using a Win32::OLE object that works like this:

    my $obj = lockpan();                        # interacts with a VB6 DLL
    my $enc_data = $obj->encryptval($data);     # returns binary value
    $enc_data = '0x' . unpack('H*', $enc_data); # bin -> hex
    # $enc_data is then written to a SQL db as 7-bit ASCII even parity
When the value is stored, it is stored as 7-bit ASCII even parity. So somehow Perl takes 7-bit ASCII and turns it into 7-bit ASCII even parity when it writes it to the database. Why, and how can I stop this? This would be fine if my .NET application read the data from the SQL table as 7-bit ASCII even parity and left it alone, but it doesn't. Instead, C#/.NET reads the value from the db as 7-bit ASCII even parity, and when it is turned into a byte[] it gets converted to 8-bit and then cannot be decrypted.

Thanks in advance

---- UPDATE ----

This was previously asked here by me in a slightly different way: NODE 333378

I managed to solve this problem on the .NET side using C# code for converting between formats. It wasn't what I was looking for, but it bypasses the Perl problem.
    // enc_data is a hex formatted string generated by the COM object my
    // .NET apps interact with. But in order for the .NET app to decrypt
    // the hex value returned by the same COM object to Perl, the hex
    // string must be transformed into the appropriate parity format by
    // doing the following:
    enc_data = Regex.Replace(enc_data, "^0x", "");
    enc_data = FromUnicodeByteArray( hex_to_bin(enc_data) );
    // now enc_data contains a value that my .NET app can send to the
    // COM object for decryption and get back the correct value

    private static byte[] hex_to_bin ( string s )
    {
        int stringLength = s.Length;
        if ( (stringLength & 0x1) != 0 ) {
            Console.WriteLine("not even numbers");
        }
        byte[] b = new byte[ stringLength / 2 ];
        for ( int i = 0, j = 0; i < stringLength; i += 2, j++ ) {
            int high = charToNibble( (s.Substring(i,   1).ToCharArray())[0] );
            int low  = charToNibble( (s.Substring(i+1, 1).ToCharArray())[0] );
            b[j] = (byte)( (high << 4) | low );
        }
        return b;
    }

    /**
     * Convert a single char to the corresponding nibble.
     *
     * @param c char to convert. Must be 0-9, a-f, or A-F; no
     *          spaces, plus or minus signs.
     *
     * @return corresponding integer
     */
    private static int charToNibble ( char c )
    {
        if ( '0' <= c && c <= '9' ) {
            return c - '0';
        }
        else if ( 'a' <= c && c <= 'f' ) {
            return c - 'a' + 0xa;
        }
        else if ( 'A' <= c && c <= 'F' ) {
            return c - 'A' + 0xa;
        }
        else {
            Console.WriteLine( "Invalid hex character: " + c );
            return 0;
        }
    }

    private static string FromUnicodeByteArray(byte[] characters)
    {
        string constructedString = Encoding.Default.GetString(characters);
        return constructedString;
    }