belg4mit has asked for the wisdom of the Perl Monks concerning the following question:
I'm updating a LONGBINARY column in a Jet 4 (MDB) database on Windows with DBD::ODBC and have encountered an odd issue. Data that should look like:
Ends up being inserted like this:
i.e., there is a null byte inserted between each legitimate byte of data. This seems similar to an issue discussed here
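For what it's worth, the corruption pattern matches what you'd get if the bytes were treated as characters and widened to UTF-16LE. A minimal, standalone sketch (core Encode only; the two sample bytes are made up for illustration):

```perl
use strict;
use warnings;
use Encode qw(encode);

# Two arbitrary sample bytes, treated as a character string and
# encoded to UTF-16LE: each original byte comes back followed by
# a null byte, which is exactly the corruption I'm seeing.
my $widened = encode('UTF-16LE', "\x30\x41");
printf "%v02x\n", $widened;   # prints 30.00.41.00
```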
When I enable DBI tracing with $DBH->trace($DBH->parse_trace_flags('SQL|odbcconnection|2|odbcunicode')), I get the following:
DBI::db=HASH(0x24d0eb8) trace level set to 0x6000100/2 (DBI @ 0x0/0) in DBI 1.634-ithread (pid 10652)
    -> prepare for DBD::ODBC::db (DBI::db=HASH(0x24d0f90)~0x24d0eb8 'Update Room Set FloorUValue=? WHERE Number=?') thr#519fb8
    SQLPrepare Update Room Set FloorUValue=? WHERE Number=?
    Processing non-utf8 sql in unicode mode
    <- prepare= ( DBI::st=HASH(0x24d35b8) ) [1 items] at db_ODBC.pl line 90
    -> execute for DBD::ODBC::st (DBI::st=HASH(0x24d35b8)~0x24d45c8 '..... ............α@..PA..(B..pB' 1) thr#519fb8
    <- execute= ( 1 ) [1 items] at db_ODBC.pl line 91
    -> DESTROY for DBD::ODBC::st (DBI::st=HASH(0x24d45c8)~INNER) thr#519fb8
    <- DESTROY= ( undef ) [1 items] at db_ODBC.pl line 70
3000000040021000400000000000000000000e04000005140000822400000724
!   -> DESTROY for DBD::ODBC::db (DBI::db=HASH(0x24d0eb8)~INNER) thr#519fb8
    SQLDisconnect=0
!   <- DESTROY= ( undef ) [1 items] during global destruction
Most of the material out there on Unicode issues with databases seems to cover character data being treated as bytes, whereas I seem to be in the opposite predicament.
I've also tried utf8::downgrade and explicitly setting the encoding to UCS-2LE, to no avail. DBI::data_string_desc returns "UTF8 off, non-ASCII".
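In case it helps: here is a minimal version of what I'm attempting, with the blob bound explicitly as SQL_LONGVARBINARY via bind_param's TYPE attribute so the driver is told the data is binary rather than character. This is only a sketch; MDB_DSN is a placeholder environment variable and the sample bytes are made up:

```perl
use strict;
use warnings;
use DBI qw(:sql_types);   # exports SQL_LONGVARBINARY, SQL_INTEGER, etc.

# Bind the blob with an explicit binary SQL type so the driver should
# not treat it as character data and widen it to UTF-16.
sub update_blob {
    my ($dbh, $number, $blob) = @_;
    my $sth = $dbh->prepare('Update Room Set FloorUValue=? WHERE Number=?');
    $sth->bind_param(1, $blob,   { TYPE => SQL_LONGVARBINARY });
    $sth->bind_param(2, $number, { TYPE => SQL_INTEGER });
    return $sth->execute;
}

# MDB_DSN is a placeholder; only connect when it is set, e.g.
# 'dbi:ODBC:driver={Microsoft Access Driver (*.mdb)};dbq=room.mdb'
if (my $dsn = $ENV{MDB_DSN}) {
    my $dbh = DBI->connect($dsn, '', '', { RaiseError => 1 });
    update_blob($dbh, 1, "\x30\x00\x00\x00\x40\x02");
    $dbh->disconnect;
}
```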
Any assistance would be much appreciated.
UPDATE: I have the problem whether I use the default DBD::ODBC from Strawberry Perl or one built specifically with Unicode disabled.
In Bob We Trust, All Others Bring Data.