October 21, 2011
Walter Bright wrote:
> On 10/17/2011 5:56 PM, so wrote:
>> On Tue, 18 Oct 2011 03:52:13 +0300, Nick Sabalausky <a@a.a> wrote:
>>
>>> That would mean that every D compiler would have to *also* be a C
>>> compiler.
>>
>> Indeed, but I see nothing wrong with it, just as I see nothing wrong
>> with inline asm; C never changes,
>
> While C code can be directly translated to D, the C macros are another
> matter.

One way is to use mcpp, which its site calls "probably number one C/C++ preprocessor now available in the world": http://mcpp.sourceforge.net/

It's portable, BSD-licensed, and implements all of the C90, C99 and C++98 specifications.
October 21, 2011
The main problem with this approach is how to support different versions of a library, or of the OS. It quickly becomes difficult to support anything but the latest version, or one fixed version.
It works beautifully for mature libs.

I still cannot help thinking that a C frontend automatically generating D modules, with the help of recipes, would be a better way.
It will need some manual intervention for "difficult" cases, mainly supplying manual translations of some macros, but that effort should be small.

One would specify whether all the files correspond to modules, or whether there are just some "main" directories/files.

Some things are easy:
#define a
enum { a=true }
#define b "xyz"
enum { b="xyz" }

One could be tempted to replace
	#ifdef x
with
	static if (is(typeof(x)) && x)
and to treat the other #if directives the same way, but in D a static if must contain complete statements, as its content must be syntactically valid, whereas the C preprocessor has no such limitation.
The way to work around this, if we create the headers on demand, is simple: evaluate all #if directives up front using the built-in definitions of the associated C compiler (gcc -E -dD, for example) and its default include paths (or use that compiler directly, taking the # line file directives into account).

Real macros are trickier. For example, one could translate

#define isSmall(x) (x<2)

into

bool isSmall(T)(T x){
	return x<2;
}

#define c(x) { x , #x }

could become

template c(alias x){
	enum c = tuple(x, x.stringof); // std.typecons.tuple
}

thus c(t) has to become c!(t).

and maybe one has to provide some macro definitions by hand, but I guess such cases are few.

In all this there is still a major pitfall: redefinitions of the same macro. It is not common, but it happens, and when it does, everything breaks.
One could give different names to the clashing symbols, but it remains ugly.
Furthermore, in D one cannot define the same interface to a C function twice and import it into the same scope through two modules, because the definitions will clash, even if private.

This makes the whole thing more complicated, but I think that a few recipes encoding the exceptions, such as macro translations and macros/definitions to suppress or rename, should work pretty well.

One could analyze whether different "views" of the same include file are compatible, and automatically check for double definitions.
It isn't an easy project, but it would be very useful if done correctly. I remember talking about it with Lindquist quite some time ago…

Fawzi
October 21, 2011
On Oct 21, 2011, at 4:20 PM, Fawzi Mohamed wrote:

> The main problem with this approach is how to support different versions of a library, or of OS. It quickly becomes difficult to support anything but the latest, or a fixed version.
> It works beautifully for mature libs.
> 
> I still cannot avoid thinking that a C frontend automatically generating D modules with the help of recipes would be a better way.
> It will need some manual intervention for "difficult" cases, mainly giving manual translation of some macros, but it should be small.

… and it seems that in the time I was offline others came up with the same idea...

October 22, 2011
On 10/21/2011 4:32 PM, Fawzi Mohamed wrote:
>
> On Oct 21, 2011, at 4:20 PM, Fawzi Mohamed wrote:
>
>> The main problem with this approach is how to support different versions of
>> a library, or of OS. It quickly becomes difficult to support anything but
>> the latest, or a fixed version. It works beautifully for mature libs.

Since github has excellent support for branches, I don't see why this is a major problem.


>> I still cannot avoid thinking that a C frontend automatically generating D
>> modules with the help of recipes would be a better way. It will need some
>> manual intervention for "difficult" cases, mainly giving manual translation
>> of some macros, but it should be small.
>
> … and it seems that in the time I was offline others came up with the same
> idea...

It's an old idea. The trouble is, as always, the C preprocessor. I'm currently converting the openssl .h files, and they are a zoo of metaprogramming using C preprocessor macros.

People are going to demand perfect translation if it is automatic.

The only way to do it is to work with the preprocessed output of the .h file, and just forget about the preprocessor.
October 22, 2011
On 10/21/2011 12:41 AM, so wrote:
> You are right, I forgot about macros. Is it only this, or is there anything else?

The only other thing is what one does about 'char': make it a D byte, ubyte, or char type?
October 22, 2011
On 10/21/2011 5:29 AM, Piotr Szturmaj wrote:
> It's portable, BSD licensed and implements all of C90, C99 and C++98
> specifications.

Preprocessing the text is not the problem. The problem is determining a D translation of the macros.
October 22, 2011
On 10/22/2011 01:20 AM, Fawzi Mohamed wrote:
> The main problem with this approach is how to support different versions of a library, or of OS. It quickly becomes difficult to support anything but the latest, or a fixed version.
> It works beautifully for mature libs.
>
> I still cannot avoid thinking that a C frontend automatically generating D modules with the help of recipes would be a better way.
> It will need some manual intervention for "difficult" cases, mainly giving manual translation of some macros, but it should be small.
>
> One would set if all the files correspond to modules, or there are just some "main" directories/files.
>
> Some things are easy:
> #define a
> enum { a=true }
> #define b "xyz"
> enum { b="xyz" }
>
> one could be tempted to replace
> 	#ifdef x
> with
> 	static if (is(typeof(x)) && x)
> and treat other #if in a similar way, but in D a static if must contain a full statement, as its content must be syntactically valid, whereas the C preprocessor does not have this limitation.

D does not have this limitation either: use string mixins. The only difference between C macros and D string mixins is that D is more explicit about the fact that the feature is mere string manipulation. There is nothing macros can do that string mixins cannot (except hijacking existing code or making macro instantiations look like function calls for transparent interchangeability, of course).


> The way to work around this, if we create the headers on demand is simple: we already evaluate all #if using the building definitions of the associated C compiler (gcc -E -dD for example) and its default include paths (or directly use it, keeping in account the # line file directives).
>
> real macros are more tricky, for example one could do
>
> #define isSmall(x) (x<2)
> bool isSmall(T)(T x){
> 	return x<2;
> }


isSmall(x); // use macro in C code

string isSmall(string x) {
        return `{return `~x~`<2;}`;
}
mixin(isSmall(q{x})); // use macro in D.

or, with Kenji Hara's proposal:

mixin template isSmall(string x){
        enum isSmall = `{return `~x~`<2;}`;
}

isSmall!q{x} // use macro in D code


>
> #define c(x) { x , #x }

string c(string x){
        return `{ `~x~` , q{`~x~`} }`;
}

mixin(c(q{x})); // use macro in D code

or, again, with Kenji Hara's proposal:

mixin template c(string x){
        enum c = `{ `~x~` , q{`~x~`} }`;
}

c!q{x} // use macro in D code

multiple parameters would possibly be best handled like this:

mixin template ADD(string x){
    enum ADD = {
        string[] p = x.split(","); // std.array.split
        assert(p.length == 2, "expected 2 parameters");
        return `( `~p[0]~` + `~p[1]~` )`;
    }();
}

ADD!q{x,y} // use macro in D code

October 22, 2011
On 10/22/2011 04:33 AM, Walter Bright wrote:
> On 10/21/2011 4:32 PM, Fawzi Mohamed wrote:
>>
>> On Oct 21, 2011, at 4:20 PM, Fawzi Mohamed wrote:
>>
>>> The main problem with this approach is how to support different
>>> versions of
>>> a library, or of OS. It quickly becomes difficult to support anything
>>> but
>>> the latest, or a fixed version. It works beautifully for mature libs.
>
> Since github has excellent support for branches, I don't see why this is
> a major problem.
>
>
>>> I still cannot avoid thinking that a C frontend automatically
>>> generating D
>>> modules with the help of recipes would be a better way. It will need
>>> some
>>> manual intervention for "difficult" cases, mainly giving manual
>>> translation
>>> of some macros, but it should be small.
>>
>> … and it seems that in the time I was offline others came up with the
>> same
>> idea...
>
> It's an old idea. The trouble is, as always, the C preprocessor. I'm
> currently converting the openssl .h files, and they are a zoo of
> metaprogramming using C preprocessor macros.
>
> People are going to demand perfect translation if it is automatic.
>
> The only way to do it is to work with the preprocessed output of the .h
> file, and just forget about the preprocessor.

Another way is to replace the preprocessor with CTFE and string mixins. I think that could be automated quite easily (modulo the possibility of some extremely heavy abuse on the C side, which could of course make the other parts of the translation a lot harder).



October 22, 2011
I have a question about licensing. If you translate a C LGPL'ed header file to D, and you keep the same license, are you still allowed to use whatever license you want in your user code that uses the new D files? Because I don't know whether using LGPL'ed .d files falls under "using the library" or "extending the library".

For example, CairoD has translated Cairo's LGPL/MPL'ed header files into equivalent Boost-licensed .d files. I know I'm asking for free lawyer advice here, but do you think this is going to be a problem?
October 22, 2011
Andrej Mitrovic wrote:
>I have a question about licensing. If you translate a C LGPL'ed header file to D, and you keep the same license, are you still allowed to use whichever license in your user code that uses the new D files? Because I don't know whether using LGPL'ed .d files falls under "using the library" or "extending the library".
>
>For example, CairoD has translated Cairo LGPL/MPL'ed header files in equivalent Boost-licensed .d files. I know I'm asking for free lawyer advice here, but do you think this is going to be a problem?

I'm interested in this as well. This is especially evil, as we currently have to link statically to D code. A D file consisting only of C imports probably doesn't count as code (and there's actually no need to compile it), but if you have macros that were translated into functions or templates, wouldn't statically linking against this LGPL'd 'header' code require everything to be LGPL/GPL licensed?

-- 
Johannes Pfau