February 07, 2007
== Quote from Ivan Senji (ivan.senji_REMOVE_@_THIS__gmail.com)'s article
> Andrei Alexandrescu (See Website For Email) wrote:
> > The ability to transform true code trees will come with D's macro abilities. But that's a few months ahead at least.
>
> I can't believe that so many hours have passed since this post and no one has asked for some details?

I was going to, but I'm still righting the mental furniture that got flipped by the last one.

Most of the template meta-stuff seems to be done using recursive patterns (maybe because it is easier to solve the "compile-time halting problem" by limiting recursion depth?).
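
For instance, the classic recursive-template idiom looks something like this (just a sketch of the general pattern, not anything specific to the macro proposal):

template Factorial(uint n)
{
    static if (n <= 1)
        const uint Factorial = 1;
    else
        const uint Factorial = n * Factorial!(n - 1);
}

static assert(Factorial!(5) == 120);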

So, since parse trees are usually thought of as recursion-friendly, I imagine it would let you take a class as a template argument the way you can currently take a tuple, and do either foreach or C[i..j]-style processing of fields, subclasses, etc.
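
Today's closest analogue is probably foreach over a template argument tuple, e.g. (a minimal sketch; printAll is a made-up name):

import std.stdio;

// The foreach below is unrolled at compile time, once per tuple element.
void printAll(T...)(T args)
{
    foreach (i, arg; args)
        writefln("arg %s: %s", i, arg);
}

void main()
{
    printAll(1, "two", 3.0);
}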

A lower-level abstraction would be to actually take a parse tree and just hand it to you, and let you iterate over it, something like casting a "char[][int]*" to an "AA*".  But this kind of low-level approach would mean the compiler would forever need to support the same parse-tree layouts, which is undesirable.

I guess the question depends on how people actually use this kind of thing; how do people use LISP macros in real-world code?  It's sort of a language-writing language, but for some reason it's hard to find non-toy examples.  (On the other hand, when I do, it's hard to read them.)

Kevin
February 07, 2007
BCS wrote:
> Walter Bright wrote:
>> BCS wrote:
>>
>>>
>>> The point is to have all of the versioning done by the time you link, that leaves a runtime check for version info.
>>
>>
>> Not if it's a const.
> 
> if it's a const then it should be a static if.
> 
> static if(globalversion.baz)
> {
>     if(baz) break;
> }
> else
>     break;
> 
> and that still doesn't cover the other case
> 
> switch(i)
> {
>   version(foo)
>     case 1:
> 
>   ...
> 
>   version(!foo)
>     case 1:
> }
> 
> or how about
> 
> outer: while(...)
> {
>  for(...)
>  {
>   ....... // lots of nesting
>         version(Foo)
>          break outer;
>         else
>          continue outer;
>  }
> }
> 
>>> All 32 possibilities??? What if there are 16 independent versions? That's 64K functions! And no, that is not an unlikely case; say "i" is a parse tree and we want to add different types of annotation depending on what features are enabled.
>>
>>
>> I'd use bit flags instead of versions for such things. 
> 
> Runtime checks? That would require that the code to do the processing be compiled in for all cases: code bloat, etc. And structures would then need to have all the fields for all the features[*] even if they will never be used: data bloat, etc.
> 
> Or are you saying use "static if"? Then what is version for? In that case I can't think of any use AT ALL for version.
> 
> Strike that, versions can be specified on the command line so they could do this:
> 
> module globalversion;
> 
> version(Foo) const bool Foo = true; else const bool Foo = false;
> ....
> 
> and then everything is done with static ifs
> 
> *version isn't just for controlling code inclusion.
> 
> struct Part
> {
>     version(Foo) Foo foo;
>     version(Boo) Boo boo;
>     version(Fig) Fig fig;
>     version(Baz) Baz baz;
>     version(Bar) Bar bar;
> }
> 
>> If I had a situation with 32*16 version combinations, I think I'd seriously consider reengineering what the program considers as a "version". After all, do you really want to generate 64,000 binaries? How are you going to test them <g>?
> 
> Most of the cases where I see version used I would expect to have several orders of magnitude more combinations possible than are ever actually built.
> 
> What I would want versioning for would be to be able to arbitrarily select what I want from a set of functionalities. Then, by specifying that on the command line, I could run a build (like with bud or a makefile that doesn't know jack about versions) and get what I want.
> 
> I'm at a loss as to what you envision for versioning.

All this discussion is moot. The feature exists now, use it how you like. If you want to use mixin(Config!(import(foo.conf))) to make your program n-dimensionally configurable, go ahead. If you agree with Walter's view of versioning, don't. I don't see that we need to have this discussion at all.

Unless and until Walter restricts the kinds of files import will accept, go ahead and use the feature in any way you like.
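
For the curious, a back-of-the-envelope sketch of what such a Config could do. The post spells it as a template, Config!(...); with compile-time function evaluation a plain function does the same job. The file format, the function body, and foo.conf are all made up for illustration (and whether today's compile-time evaluation accepts every line of it is another question). Build with -J. so import("foo.conf") can find the file.

// Turn "name=value" lines (e.g. "Foo=true\nBar=false\n") into
// const bool declarations at compile time.
char[] Config(char[] conf)
{
    char[] code;
    char[] name;
    char[] value;
    bool inValue = false;

    foreach (c; conf)
    {
        if (c == '=')
            inValue = true;
        else if (c == '\n' || c == '\r')
        {
            if (name.length)
                code ~= "const bool " ~ name ~ " = " ~ value ~ ";\n";
            name = null;
            value = null;
            inValue = false;
        }
        else if (inValue)
            value ~= c;
        else
            name ~= c;
    }
    if (name.length)
        code ~= "const bool " ~ name ~ " = " ~ value ~ ";\n";
    return code;
}

// Every name in foo.conf becomes a compile-time constant:
mixin(Config(import("foo.conf")));

static if (Foo)
{
    // Foo-only declarations go here...
}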
February 07, 2007
Yauheni Akhotnikau wrote:
> On Wed, 07 Feb 2007 22:18:28 +0300, Walter Bright <newshound@digitalmars.com> wrote:
> 
>> Yauheni Akhotnikau wrote:
>>> Do you think this task can be done with D templates at compile time?
>>
>> Yes, that's exactly the intent. If this can't be made to work, we'll fix D so it can.
> 
> Maybe I'm wrong, but I think that 'static if' and recursive templates (and the other techniques available for metaprogramming at compile time) are not as powerful as ordinary D itself. So it would be much preferable to me to program such a DSL as a 'normal' D program. Maybe it is a good idea to have 'staged' compilation? For example, the DSL transformation code is written as an ordinary D program. That code is compiled in the first compilation stage, then invoked by the compiler, and its result is fed as input to the next stage.
> 
> Something like that:
> 
> // active_record.d
> // DActiveRecord implementation.
> module active_record;
> 
> // DSL transformer.
> char[] DActiveRecord( char[] input ) { ... }
> 
> ===
> 
> // demo.d
> // DActiveRecord usage.
> module demo;
> 
> import active_record;
> 
> // Function DActiveRecord will be called at compile time.
> mixin( DActiveRecord( "class Account ... end" ) );
> 
> ===
> 
> Two points must be highlighted here:
> * the code of DActiveRecord must be used only at compile time and thrown out of the resulting application code;
> * multiple stages must be allowed: for example, DActiveRecord may depend on another DSL and so on.
> 
> --Regards,
> Yauheni Akhotnikau

I agree with the point that metaprogramming needs more control structures.

static for, static foreach, static while, static do, static switch case, etc.
February 08, 2007
BCS wrote:
> Walter Bright wrote:
>> BCS wrote:
>>> The point is to have all of the versioning done by the time you link, that leaves a runtime check for version info.
>> Not if it's a const.
> 
> if it's a const then it should be a static if.

That depends. if and static if have many differences in how they work, but if will do constant folding whenever it can, as a matter of course.
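
A tiny illustration of the difference, with a const flag standing in for the baz of the example above (doBazThing/doOtherThing are just stand-ins):

const bool baz = true;

void doBazThing(int i) { /* ... */ }
void doOtherThing(int i) { /* ... */ }

void f(int i)
{
    // The condition is a constant, so the compiler folds it and emits only
    // one branch; both branches still have to be valid, compilable code.
    if (baz)
        doBazThing(i);
    else
        doOtherThing(i);

    // static if drops the branch not taken entirely (it only has to parse),
    // and it also works at declaration scope, where plain if cannot appear.
    static if (baz)
        doBazThing(i);
    else
        doOtherThing(i);
}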

> switch(i)
> {
>   version(foo)
>     case 1:
> 
>   ...
> 
>   version(!foo)
>     case 1:
> }

C'mon,
	case 1:
		if (foo)
			...
		else
			...


> or how about
> 
> outer: while(...)
> {
>  for(...)
>  {
>   ....... // lots of nesting
>         version(Foo)
>          break outer;
>         else
>          continue outer;
>  }
> }

If someone in my employ wrote such a thing, they'd have a lot of 'splaining to do. Version statements don't always have to be at the lowest level possible - they can always be moved out to higher levels, until you find the right abstraction spot for them.
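
For instance, one way to lift that decision out of the nesting (a sketch; the flag name and the helpers are placeholders):

version (Foo)
    const bool stopOuterOnHit = true;
else
    const bool stopOuterOnHit = false;

bool haveMoreWork() { return false; }   // placeholder
bool hit(int i)     { return false; }   // placeholder

void search()
{
    outer: while (haveMoreWork())
    {
        for (int i = 0; i < 100; i++)
        {
            // ... lots of nesting ...
            if (hit(i))
            {
                // The flag is a constant, so this folds to a single branch;
                // the version decision now lives at the top of the module.
                if (stopOuterOnHit)
                    break outer;
                else
                    continue outer;
            }
        }
    }
}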

> What I would want versioning for would be to be able to arbitrarily select what I want from a set of functionalities. Then, by specifying that on the command line, I could run a build (like with bud or a makefile that doesn't know jack about versions) and get what I want.
> 
> I'm at a loss as to what you envision for versioning.

I think you view version as a scalpel, while I see it as more like an axe.
February 08, 2007
Kyle Furlong wrote:
> Unless and until Walter restricts the kinds of files import will accept, go ahead and use the feature in any way you like.

I suppose it's like identifier naming conventions. There are ways to do it that I feel are wrong, and there are ways that are right. The D compiler doesn't care, it'll compile either. Coding style standards are something altogether different from language standards.

Remember the C thing:

	#define BEGIN {
	#define END }

	...
	BEGIN
		statements...
	END

? That was pretty popular in the early days of C, when a lot of C programmers came from Pascal and tried to make C look like Pascal. Experience eventually showed that this was just not a good style, and it is strongly frowned upon today. Even so, every C compiler will accept such code if you want to write it.

February 08, 2007
Kyle Furlong wrote:
> Yauheni Akhotnikau wrote:
>> On Wed, 07 Feb 2007 22:18:28 +0300, Walter Bright <newshound@digitalmars.com> wrote:
>>
>>> Yauheni Akhotnikau wrote:
>>>> Do you think this task can be done with D templates at compile time?
>>>
>>> Yes, that's exactly the intent. If this can't be made to work, we'll fix D so it can.
>>
>> Maybe I'm wrong, but I think that 'static if' and recursive templates (and the other techniques available for metaprogramming at compile time) are not as powerful as ordinary D itself. So it would be much preferable to me to program such a DSL as a 'normal' D program. Maybe it is a good idea to have 'staged' compilation? For example, the DSL transformation code is written as an ordinary D program. That code is compiled in the first compilation stage, then invoked by the compiler, and its result is fed as input to the next stage.
>>
>> Something like that:
>>
>> // active_record.d
>> // DActiveRecord implementation.
>> module active_record;
>>
>> // DSL transformer.
>> char[] DActiveRecord( char[] input ) { ... }
>>
>> ===
>>
>> // demo.d
>> // DActiveRecord usage.
>> module demo;
>>
>> import active_record;
>>
>> // Function DActiveRecord will be called at compile time.
>> mixin( DActiveRecord( "class Account ... end" ) );
>>
>> ===
>>
>> Two points must be highlighted here:
>> * the code of DActiveRecord must be used only at compile time and thrown out of the resulting application code;
>> * multiple stages must be allowed: for example, DActiveRecord may depend on another DSL and so on.
>>
>> --Regards,
>> Yauheni Akhotnikau
> 
> I agree with the point that metaprogramming needs more control structures.
> 
> static for, static foreach, static while, static do, static switch case, etc.

Static loops are not very useful without compile-time mutation.
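
That is, there is no compile-time variable to increment, so today a "static loop" has to be spelled as recursion; the usual idiom looks something like this sketch:

// Build a string at compile time by recursion, since a hypothetical
// static for would have nothing mutable to count with.
template Repeat(char[] s, uint n)
{
    static if (n == 0)
        const char[] Repeat = "";
    else
        const char[] Repeat = s ~ Repeat!(s, n - 1);
}

static assert(Repeat!("ab", 3) == "ababab");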

Andrei
February 08, 2007
Anders F Björklund wrote:
> Walter Bright wrote:
>> I've also tried the "make an API for the version" method, and have been much more satisfied with it. You can see it at work in the gc implementation (see gclinux.d and win32.d).
> 
> OK, I see what you mean. And the Makefile would pick which
> implementation gets used, for the common interface chosen?

Yes, that's a reasonable way to do it.

> Not sure there's a whole world of difference between the:
> 
> version(Win32) // or foowin32.d
> void foo() { ...this... }
> version(linux) // or foolinux.d
> void foo() { ...that... }
> 
> and
> 
> void foo()
> {
> version(Win32)
> ...this...
> version(linux)
> ...that...
> }

There isn't if the functions are trivial. But when they get more involved, it becomes a mess. How many times have you had to run just the preprocessor to figure out which branch of the rat's nest of #if's was actually getting compiled?

Many years ago, when assembler coding was popular, Microsoft produced a macro library to make coding in assembler sort-of-but-not-quite-like coding in some pseudo-high-level language. The layers of macros were so bad that a friend of mine, in order to work on such code written by others, resorted to assembling it, *disassembling* the result, pasting the disassembled source back into the source file, and starting over.

> But at any rate, it's preferred to handle it outside of D. OK.
> Either through rewriting the code, or in the Makefiles. OK.
> 
> With autoconf, I normally want the SAME piece of code to
> work on all platforms - so it takes a different approach.
> 
> For D, I would instead write one piece of code for EACH
> platform and avoid versioning it (as much as possible) ?

Yes, in the end, I think that's a more maintainable solution. You'll find your core modules will become much more portable, and you shouldn't need to edit (or even understand) them at all when porting to a new platform.

If your job is to port the gc to a new XYZ platform, would you find it easier to edit the long and complicated gcx.d, or just copy gclinux.d to gcXYZ.d and restrict your work to just figuring out how to port a few lines of code with (hopefully) well-defined behavior?
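
Concretely, the shape is something like the following, reusing the foolinux.d/foowin32.d naming from the quoted code above (osAlloc/osFree are invented stand-ins, not the real gc API):

// foolinux.d -- the Linux implementation of the common interface.
module foolinux;

void* osAlloc(size_t nbytes) { /* mmap, etc. */ return null; }
void  osFree(void* p)        { /* munmap, etc. */ }

===

// foowin32.d -- the same two functions, implemented with Win32 calls.
module foowin32;

void* osAlloc(size_t nbytes) { /* VirtualAlloc, etc. */ return null; }
void  osFree(void* p)        { /* VirtualFree, etc. */ }

===

// client.d -- the only versioning left is picking the import (or let the
// makefile pass the right implementation file instead). Porting to a new
// XYZ platform means writing fooxyz.d and touching nothing else.
version (Win32)
    import foowin32;
else version (linux)
    import foolinux;

void useIt()
{
    void* p = osAlloc(4096);
    // ...
    osFree(p);
}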


> I've worked with the "./configure && make" approach for
> many years and kinda like it, but will try out new ideas.
> Then again I kinda like the preprocessor too, and prefer
> writing C over C++, so maybe I'm just stuck on the old. :-)
> 
> Not changing anything for the "ported" projects (like wxD),
> but it will be something to keep in mind for future D ones.
> Will do some thinking on how it would apply to options...
> (choices, as opposed to just platform/portability macros)
> 
> --anders
February 08, 2007
Walter Bright wrote:
> Ivan Senji wrote:
>> Andrei Alexandrescu (See Website For Email) wrote:
>>> The ability to transform true code trees will come with D's macro abilities. But that's a few months ahead at least.
>>
>> I can't believe that so many hours have passed since this post and no one has asked for some details?
>>
>> Is this something you and Walter talked about? (Because it sounds too good to be true.) What will those macros be like?
>> Examples...?
> 
> Nothing at the moment but a lot of:
> 
> ....
> <insert magic here>
> voila!

That's pretty much what I had in mind :o).

Actually, I did post about that in a thread in the digitalmars.d group. Search for the title "Idea : Expression Type".

> Andrei has been mostly hard at work on the const/inout/scope problem. The current way D does it is hackish, and Andrei feels (and I agree) that bringing rigor to it will make D a considerably stronger language.

There's good progress on that!


Andrei
February 08, 2007
Walter Bright wrote:
> I think you view version as a scalpel, while I see it as more like an axe.

Hear Hear! <g>
February 08, 2007
janderson wrote:
> Andreas Kochenburger wrote:
>> BLS wrote:
>>> I guess here is a need for further explaination.
>>>
>>> Either I am a complete idiot (not completely unrealistic) and misunderstood something, or a new, quite radical, programming paradigm change is on its way.  I mean, it is difficult to realize the implications.
>>> Bjoern
>>
>> I am not a D programmer (yet) only observing what is happening.
>>
>> I compare the new "code generation at compile-time" stuff in D with Forth. Forth also has a built-in interpreter & compiler which extends the language and can also execute macros at compile-time through EVALUATE. Of course Forth is much more low-level than D. But IMO the new mixins are not a "radical programming paradigm change".
>>
>> Perhaps I just did misunderstand something?
>>
>> Andreas
> 
> I'm not familiar with Forth.  Can you provide some examples?  Does it allow partial macro definitions?  Can you apply string operations on them at compile time?
> 
> -Joel

Forth is a reverse Polish (postfix) language, as opposed to a Polish (prefix) one.
Every word understood by the system is a command.
Forth doesn't exactly HAVE a grammar.  What it has is two stacks.  (Sometimes more, but the additional stacks are optional, and their usage is inconsistent at best.)
Forth commands generally operate on the stack.  There are exceptional commands, those marked IMMEDIATE, which operate on the input stream.

E.g.:
1 means push a 1 to the top of the stack.  (This is typical of all integer literals.  Floats are, or were, not standardized.)

+ means add together the top two items on the stack (removing them from the stack) and push their sum onto the top of the stack.

dup means take the value (number?) at the top of the stack and, without removing it, push a copy of it onto the top of the stack.

if means examine the top of the stack.  If it's true, continue execution; if not, skip down to the word following the end marker.

It's been too long or I'd give a few more examples.  The important thing to note is that words are executed without respect to grammar, but with the ability to determine their context.
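
For the D-minded, the execution model described above boils down to something like this toy sketch (in D, not Forth; the names are made up):

import std.stdio;

int[] stack;                      // the data stack

void push(int v) { stack ~= v; }
int  pop()       { int v = stack[$ - 1]; stack = stack[0 .. $ - 1]; return v; }

// "Words" are just routines that rearrange the stack.
void plus()   { int b = pop(); int a = pop(); push(a + b); }
void dupTop() { int a = pop(); push(a); push(a); }

void main()
{
    // The Forth line "1 2 + dup", executed word by word:
    push(1); push(2); plus(); dupTop();
    writefln("%s", stack);        // the stack now holds two 3s
}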

Forth is very similar to LISP, only with a simpler grammar. I.e., the grammar is simple serial execution, with certain words (those marked immediate) able to manipulate the input stream and so determine what gets executed next.

N.B.:  I'm discussing a basic Forth system, approximating FIG-FORTH, which is as close to standardized as Forth gets. There have been considerable variations.  My favorite was Neon, an Object-Oriented Forth for the Mac from Kyria Systems (now long defunct).  If they hadn't died while attempting to transition to MSWind95 I might have ended up as a Forth programmer.

But don't confuse Forth, in any of its variations, with D. Forth didn't really create compiled code...and it also wasn't an interpreter in any normal sense of the term.  (I'd say isn't, but I'm not really familiar with current Forths.) Forth was what was called a "Threaded Interpretive Language".  It might be best to say that you didn't program Forth, you built a software machine that, when run, did what you wanted the program to accomplish.

OTOH, looked at from a different angle, the closest analog to Forth is Smalltalk.  But Smalltalk has too much grammar. Still, they both have that system library that is grown by users...and which makes the environment both richer and more complicated to use than languages like D, where the libraries are more distinct from the language.