Thread overview
[Issue 24158] ImportC: enums created from string literal #defines don’t implicitly convert to const(char)* in D.
October 29

--- Comment #1 from Walter Bright <> ---
That is probably because the C literal becomes a char[2] instead of a const(char)[2]. I'll look into that.

November 19

--- Comment #2 from Walter Bright <> ---
The problem is at expressionsem.d(4188):

            if (sc && sc.flags & SCOPE.Cfile)
                e.type = Type.tchar.sarrayOf(e.len + 1);
            else
                e.type = Type.tchar.immutableOf().arrayOf();

I'm not sure what the solution is. I'm not sure this even should be fixed, after all, C semantics are different.

Although, as a workaround, appending .ptr will work.

November 19

--- Comment #3 from ---
One of the issues with the current state is that if you generate a .di file from the C file, you get a D enum with the expected behavior, but if you directly import the C file, you trigger the reported problem.

As these are enums collected from macros, they don’t need to follow C semantics. The C code has already been preprocessor-expanded and so doesn’t use these defines directly. The collected enums exist only for the convenience of the D user and can follow D semantics.

I ran into this issue when using importC with SDL. Their docs and examples show code like:

    SDL_bool ok = SDL_SetHint(SDL_HINT_NO_SIGNAL_HANDLERS, "1");

where SDL_HINT_NO_SIGNAL_HANDLERS is a #define. I was able to work around it
with .ptr, but the more that examples “just work”, the better.
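For reference, the .ptr workaround in D looks like the following (a sketch: the module name and the SDL declarations are assumed to come from an ImportC-compiled SDL header):

```d
import sdl;   // hypothetical module name for the ImportC-compiled SDL header

void main()
{
    // Under ImportC, SDL_HINT_NO_SIGNAL_HANDLERS is an enum of type
    // char[N], which does not implicitly convert to const(char)*.
    // Appending .ptr yields the pointer the C API expects:
    SDL_bool ok = SDL_SetHint(SDL_HINT_NO_SIGNAL_HANDLERS.ptr, "1");
}
```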

November 19

--- Comment #4 from ---
So the problem is that the string literal is being treated with C semantics in a D file, when you want it treated with D semantics because it is the result of a preprocessor-created enum.

November 20

--- Comment #5 from Walter Bright <> ---
A .di file has D semantics, even if it was generated from a .c file. Hence, it will always be imperfect, subject to the impedance mismatches between D and C.

If the C semantics of a string literal were changed to match D, then they would no longer work for C files.

If a special case were added to D's implicit conversion rules so that a char[2] implicitly converted to const(char)*, who knows what D code would break that relied on the overloading differences between them.
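The overloading hazard can be made concrete with a hypothetical D sketch (the functions here are invented for illustration):

```d
void f(const(char)* p)  { /* pointer overload */ }
void f(const(char)[] s) { /* slice overload */ }

void main()
{
    char[2] a = "hi";
    // Today this unambiguously picks the slice overload, via the
    // existing char[2] -> const(char)[] conversion. If char[2] also
    // implicitly converted to const(char)*, the call could become
    // ambiguous or silently resolve to a different overload.
    f(a);
}
```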

I cannot think of a solution that resolves this.

November 20

--- Comment #6 from Walter Bright <> ---
> it seems like the more that examples “just work” the better.

I completely agree with that, but we also can't break things.

November 20

--- Comment #7 from ---
I think I was a little unclear in my previous comments. What I meant is that these enums are inserted by the compiler into the C code, but they are really D enums that happen to come from a C file. They should follow D rules, i.e. be an immutable(char)[2] or whatever, not a char[2], because they are not C enums; they are D enums that just happen to live in a module resulting from a C file. The C code can’t access them, only D code can. The check `sc.flags & SCOPE.Cfile` should actually be false for them.

It might not be worth the effort to fix that, I don’t know. It is a minor inconvenience to need to write .ptr.