May 09, 2011 Why does toUTF16z only work with UTF8 encoding?
toUTF16 can take a char[], wchar[], or dchar[]. But toUTF16z can only take a char[]. Why? I'm storing some text as dchar[] internally and have to pass it to WinAPI Unicode functions which expect null-terminated UTF-16 strings. But toUTF16z only works with char[] for some reason.
May 09, 2011 Re: Why does toUTF16z only work with UTF8 encoding?
Posted in reply to Andrej Mitrovic

I guess this should do it:

const(wchar)* toUTF16z(in dchar[] s)
{
    return (toUTF16(s) ~ "\000").ptr;
}
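Building on the reply, here is a minimal sketch of a generic variant that accepts any of the three string widths, since std.utf.toUTF16 itself transcodes from char[], wchar[], or dchar[]. The name toUTF16zAny and the template constraint are illustrative, not part of Phobos; later versions of std.utf generalized toUTF16z along similar lines.

```d
import std.utf : toUTF16;

// Hypothetical generic helper (name is illustrative): transcode any
// UTF width to UTF-16 and append a terminating null for WinAPI calls.
const(wchar)* toUTF16zAny(C)(const(C)[] s)
    if (is(C == char) || is(C == wchar) || is(C == dchar))
{
    // toUTF16 handles all three source encodings; '\0' terminates the buffer.
    return (toUTF16(s) ~ '\0').ptr;
}

void main()
{
    dchar[] text = "Hello"d.dup;
    const(wchar)* p = toUTF16zAny(text);
    // p can now be passed to a wide WinAPI function such as MessageBoxW.
    assert(p[5] == '\0'); // terminator follows the 5 UTF-16 code units
}
```

Note that the returned pointer refers to GC-allocated memory, so it stays valid only as long as the GC can see a reference to it; that is fine for a synchronous WinAPI call but not for storing the pointer long-term on the C side.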
Copyright © 1999-2021 by the D Language Foundation