Oracle has a nasty ADO bug that affects 10.2.0.x Windows clients, and I believe 11g R1 clients as well.
When you run a query like select N'X' from dual in an ADO recordset, with AL16UTF16 as the national character set, and then inspect the field's DefinedSize and ActualSize properties in ADO, you get a DefinedSize of 1 and an ActualSize of 2.
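To reproduce this, open the query in an ADO recordset and read the two properties off the field object. Below is a minimal sketch in Python using pywin32's COM support; the provider and connection details are placeholders rather than anything from the original post, and the same check can of course be done from VBScript or VB6.

    # Minimal sketch, assuming pywin32 and the Oracle OLE DB provider are installed.
    # Connection string is a placeholder.
    import win32com.client

    conn = win32com.client.Dispatch("ADODB.Connection")
    conn.Open("Provider=OraOLEDB.Oracle;Data Source=ORCL;User Id=scott;Password=tiger")

    rs = win32com.client.Dispatch("ADODB.Recordset")
    rs.Open("select N'X' from dual", conn)

    fld = rs.Fields.Item(0)
    print("Type:", fld.Type)                # 202 = adVarWChar
    print("DefinedSize:", fld.DefinedSize)  # buggy clients report 1
    print("ActualSize:", fld.ActualSize)    # 2 (bytes for 'X' in AL16UTF16)

    rs.Close()
    conn.Close()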
That's obviously wrong: DefinedSize is the maximum capacity of the field, and in this case the field is returned as an adVarWChar type. It should therefore be 2, because the letter 'X' in AL16UTF16 is represented as 2 bytes, and for an adVarWChar field that is all it needs to store the value.
Now if it were the supplementary Han character 𧉧, which is U+27267, the ActualSize would be 4 and the DefinedSize would be 4 as well. That's because in UTF-16 that character is encoded as the surrogate pair 0xD85C 0xDC67, which is 4 bytes long.
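If you want to check the encoding arithmetic yourself, a couple of lines of Python (not part of the original post) show the surrogate pair and the 4-byte length:

    # U+27267 encodes to the surrogate pair 0xD85C 0xDC67 in UTF-16.
    ch = "\U00027267"
    data = ch.encode("utf-16-be")
    print(data.hex())   # d85cdc67
    print(len(data))    # 4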
This particular issue has caused my company a great deal of grief, and it took Oracle a long time to acknowledge there was a problem. However, we have been told that it will be fixed in Windows patch bundle 26 for the Oracle 10g client (the bug number is 8301952), and even better, I believe it has been fixed in Oracle 11g R2.
Wednesday, November 4, 2009