March 22, 2009
Re: Please integrate build framework into the compiler
On Sun, 22 Mar 2009 04:19:31 +0800, grauzone <none@example.net> wrote:

> I don't really understand what you mean. But if you want the compiler to  
> scan for dependencies, I fully agree.
>
> I claim that we don't even need incremental compilation. It would be  
> better if the compiler would scan for dependencies, and if a source file  
> has changed, recompile the whole project in one go. This would be simple  
> and efficient.
>

This may not be true. Consider the dwt lib case: once you tweak a module 
only slightly (that is, you modify neither any interface that connects to 
outside modules nor any code that could possibly affect modules in the 
same package), the optimal way is

dmd -c your_tweaked_module
link all_obj

That's much faster than regenerating all the other object files. Yes, 
feeding them all to DMD compiles really fast, but writing all the object 
files to disk costs much time. And your impression of incremental 
compilation seems to be misguided by the rebuild and dsss systems. 
Rebuild takes no advantage of .di files, so it has to recompile every 
time, even when a module depends only on .di files that are unchanged. I 
posted several blocking header-generation bugs in DMD, with fixes. Only 
a little more change is needed before dmd can generate almost all header 
files correctly. I tested tango, dwt, and dwt-addons; those projects are 
very big, and some make advanced use of templates. So the 
header-generation build strategy is really not far away.

A little self-promotion here, in case Walter misses some of them:
http://d.puremagic.com/issues/show_bug.cgi?id=2744
http://d.puremagic.com/issues/show_bug.cgi?id=2745
http://d.puremagic.com/issues/show_bug.cgi?id=2747
http://d.puremagic.com/issues/show_bug.cgi?id=2748
http://d.puremagic.com/issues/show_bug.cgi?id=2751

In C++, a sophisticated makefile carefully tracks the .h dependencies of 
.c files, so that once a .h file is updated, the .c files that depend on 
it are recompiled. The same detection can be done for D by comparing the 
old .di files with the newly generated ones and testing them for equality.
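
Roughly, that check could look like this in D (the file and helper names 
are invented; -o- and -Hf are dmd's real no-object-file and 
header-output switches):

import std.file : exists, read;
import std.process : executeShell;
import std.stdio : writeln;

// Regenerate the .di header for one module and report whether its
// public interface changed, by byte-comparing the new header
// against the old one.
bool interfaceChanged(string mod)
{
    string oldDi = mod ~ ".di";
    string newDi = mod ~ ".di.new";
    // -o- suppresses the object file; -Hf names the generated header.
    executeShell("dmd -o- -Hf" ~ newDi ~ " " ~ mod ~ ".d");
    return !exists(oldDi)
        || cast(ubyte[]) read(oldDi) != cast(ubyte[]) read(newDi);
}

void main()
{
    writeln(interfaceChanged("your_tweaked_module")
        ? "interface changed: recompile importers too"
        : "interface unchanged: recompile this module and relink");
}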
March 22, 2009
Re: Please integrate build framework into the compiler
My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041 
contains an outdated, buggy version. Where can I find the up-to-date 
source code?

Another question, rdmd just calls dmd, right? How does it scan for 
dependencies, or is this step actually done by dmd itself?
March 22, 2009
Re: Please integrate build framework into the compiler
> A little self-promotion here, in case Walter misses some of them:
> http://d.puremagic.com/issues/show_bug.cgi?id=2744
> http://d.puremagic.com/issues/show_bug.cgi?id=2745
> http://d.puremagic.com/issues/show_bug.cgi?id=2747
> http://d.puremagic.com/issues/show_bug.cgi?id=2748
> http://d.puremagic.com/issues/show_bug.cgi?id=2751

If it's about bugs, it would (probably) be easier for Walter to fix that 
code generation bug that forces dsss/rebuild to invoke a new dmd 
process to recompile each outdated file separately.

This would bring a critical speedup for incremental compilation (from 
absolutely useless to relatively useful), and all impatient D users with 
medium-sized source bases could be happy.

> In C++, a sophisticated makefile carefully tracks the .h dependencies 
> of .c files, so that once a .h file is updated, the .c files that 
> depend on it are recompiled. The same detection can be done for D by 
> comparing the old .di files with the newly generated ones and testing 
> them for equality.

This sounds like a really nice idea, but it's also quite complex.

For example, to guarantee correctness, the D compiler would _always_ 
have to read the .di file when importing a module (and not the .d file 
directly). If it didn't do that, it could "accidentally" use information 
that isn't included in the .di file (like function bodies when doing 
inlining; see the sketch below). This means the .di files would have to 
be generated first, and in doing so you'd also have to deal with 
circular dependencies, which will bring extra headaches. And of course, 
you'd need to fix all those .di generation bugs. It's actually a bit 
scary that the compiler not only has to be able to parse D code, but 
also to output D source code again. And .di files are not even 
standardized.

It's perhaps messy enough to deem it unrealistic. Still, nice idea.
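
To make the inlining point concrete (module and function names invented): 
a body that exists in the .d file may be absent from the generated .di, 
so a compiler that peeks at the .d can inline what an honest .di reader 
cannot:

// math.d -- the implementation module
module math;

int square(int x)
{
    return x * x; // body present: importing math.d allows inlining
}

// math.di -- the corresponding header; the body may be stripped
module math;

int square(int x); // declaration only: nothing to inline from here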
March 22, 2009
Re: Please integrate build framework into the compiler
grauzone wrote:
> My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041 
> contains an outdated, buggy version. Where can I find the up-to-date 
> source code?

Hold off on that for now.

> Another question, rdmd just calls dmd, right? How does it scan for 
> dependencies, or is this step actually done by dmd itself?

rdmd invokes dmd -v to get deps (a rough sketch of that step is below). 
It's an interesting idea to add a compilation mode to rdmd that asks dmd 
to generate headers and diff them against the old headers. That way we 
can implement incremental rebuilds without changing the compiler.
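
A rough sketch of that extraction step in D ("main.d" stands in for the 
root module; treat the "import    module (path)" line format as 
illustrative):

import std.algorithm.searching : startsWith;
import std.process : executeShell;
import std.stdio : writeln;
import std.string : indexOf, lineSplitter;

void main()
{
    // -o- : generate no code; just let dmd report what it imports.
    auto r = executeShell("dmd -v -o- main.d");
    foreach (line; r.output.lineSplitter)
    {
        // Verbose output contains lines like:
        // import    std.stdio (/path/to/phobos/std/stdio.d)
        if (line.startsWith("import"))
        {
            auto open = line.indexOf('(');
            auto close = line.indexOf(')');
            if (open >= 0 && close > open)
                writeln(line[open + 1 .. close]); // dependency file path
        }
    }
}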

Andrei
March 22, 2009
Re: Please integrate build framework into the compiler
On Sun, 22 Mar 2009 12:18:03 +0800, Andrei Alexandrescu  
<SeeWebsiteForEmail@erdani.org> wrote:

> grauzone wrote:
>> My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041  
>> contains an outdated, buggy version. Where can I find the up-to-date  
>> source code?
>
> Hold off on that for now.
>
>> Another question, rdmd just calls dmd, right? How does it scan for  
>> dependencies, or is this step actually done by dmd itself?
>
> rdmd invokes dmd -v to get deps. It's an interesting idea to add a  
> compilation mode to rdmd that asks dmd to generate headers and diff them  
> against the old headers. That way we can implement incremental rebuilds  
> without changing the compiler.
>
> Andrei

The bad news is that public imports ruin the simplicity of dependencies, 
though in most cases D projects use private imports.

Maybe we can restrict public imports further.

I suggest we add a new style of module for interfacing. Public imports 
would only be allowed in those modules, and an interface module could 
contain only public imports.

example: all.d

module(interface) all;
public import blah;
public import blah.foo;

An interface module cannot import another interface module, so no public 
import chain can be created. The shortcoming is:

module(interface) subpack.all;
public import subpack.mod;

module(interface) all;
public import subpack.mod;     	// duplication here.
public import subpack1.mod1;
March 22, 2009
Re: Please integrate build framework into the compiler
davidl wrote:
> On Sun, 22 Mar 2009 12:18:03 +0800, Andrei Alexandrescu 
> <SeeWebsiteForEmail@erdani.org> wrote:
> 
>> grauzone wrote:
>>> My rdmd doesn't know --chatty. Probably the zip file for dmd 1.041 
>>> contains an outdated, buggy version. Where can I find the up-to-date 
>>> source code?
>>
>> Hold off on that for now.
>>
>>> Another question, rdmd just calls dmd, right? How does it scan for 
>>> dependencies, or is this step actually done by dmd itself?
>>
>> rdmd invokes dmd -v to get deps. It's an interesting idea to add a 
>> compilation mode to rdmd that asks dmd to generate headers and diff 
>> them against the old headers. That way we can implement incremental 
>> rebuilds without changing the compiler.
>>
>> Andrei
> 
> The bad news is that public imports ruin the simplicity of dependencies, 
> though in most cases D projects use private imports.
> 
> Maybe we can restrict public imports further.

Yes. They could give a compile-time error... always. ;-)
March 22, 2009
Re: Please integrate build framework into the compiler
On Sat, 21 Mar 2009 22:19:31 +0200, grauzone <none@example.net> wrote:
> I don't really understand what you mean. But if you want the compiler to  
> scan for dependencies, I fully agree.
>
> I claim that we don't even need incremental compilation. It would be  
> better if the compiler would scan for dependencies, and if a source file  
> has changed, recompile the whole project in one go. This would be simple  
> and efficient.

Well, why not get rid of the imports altogether... OK, that would not be  
feasible because of the way compilers (D, C++, etc.) are built nowadays.

I find adding #includes/imports laborious. (Is this component already  
#included/imported? Where's that class defined? Did I forget something?)  
And when you modify or refactor the file, you have to update the  
#includes/imports accordingly...

(In the case of modification/refactoring) the easiest way is just to  
compile the file and see if there are errors... Of course, that approach  
won't help remove the unnecessary #includes/imports.

So, sometimes (usually?) I give up, create one huge #include/import file  
that #includes/imports all the stuff, and use that instead. Efficient?  
Pretty? No. Easy? Simple? Yes.
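
In D that catch-all file is just a module of public imports; a tiny 
sketch with invented names:

// all.d -- import this one module instead of per-file import lists
module myproject.all;

public import myproject.gui;
public import myproject.net;
public import myproject.util;

Client code then says "import myproject.all;" and stops worrying about 
individual imports, at the cost of pulling in everything.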

#includes/imports are redundant information: the source code of course  
describes what's used in it. So, the compiler could be aware of the whole  
project (and the libraries used) instead of one file at a time.
March 22, 2009
Re: Please integrate build framework into the compiler
Kristian Kilpi wrote:
> #includes/imports are redundant information: the source code of course 
> describes what's used in it. So, the compiler could be aware of the 
> whole project (and the libraries used) instead of one file at a time.

That's not sufficient. I'm using SDL right now; if I type 'Surface s;', 
should that import sdl.surface or cairo.Surface? How is the compiler to 
tell? How should the compiler find out where to look for classes named 
Surface? Should it scan everything under /usr/local/include/d/? That's 
going to be pointlessly expensive.
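
For the record, the compiler refuses to guess even when both modules are 
imported explicitly; a sketch borrowing the names above (assuming both 
bindings declare a Surface type):

import sdl.surface;   // suppose this module defines a Surface
import cairo.Surface; // and so does this one

void main()
{
    // Surface s;          // error: "Surface" matches both imports
    sdl.surface.Surface s; // OK: fully qualified name disambiguates
}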
March 22, 2009
Re: Please integrate build framework into the compiler
On Sun, 22 Mar 2009 14:14:39 +0200, Christopher Wright  
<dhasenan@gmail.com> wrote:

> Kristian Kilpi wrote:
>> #includes/imports are redundant information: the source code of course  
>> describes what's used in it. So, the compiler could be aware of the  
>> whole project (and the libraries used) instead of one file at a time.
>
> That's not sufficient. I'm using SDL right now; if I type 'Surface s;',  
> should that import sdl.surface or cairo.Surface? How is the compiler to  
> tell? How should the compiler find out where to look for classes named  
> Surface? Should it scan everything under /usr/local/include/d/? That's  
> going to be pointlessly expensive.

Such things should of course be told to the compiler somehow, by using  
the project configuration or by other means. (It's only a matter of  
definition.)

For example, if my project contains the Surface class, then 'Surface s;'  
should of course refer to it. If some library (used by the project) also  
has a Surface class, then one should use some other way to refer to it  
(e.g. sdl.Surface).

But my point was that compilers today do not have knowledge of the  
project as a whole. That makes this kind of 'scanning' too expensive (in  
the current compiler implementations). But if compilers were built  
differently, that wouldn't have to be true.


If I were to create/design a compiler (which I am not ;) ), it would be  
something like this:

Every file is cached (why read and parse files over and over again if  
it's not necessary?). These cache files would contain all the information  
(parse trees, interfaces, etc.) needed during compilation (of the whole  
project). They would also contain the compilation results (i.e.  
assembly). So these cache/database files would logically replace the old  
object files.

That is, there would be a database for the whole project. When something  
gets changed, the compiler knows what effect the change has and what  
needs to be done.

And finally, I would also change the format of libraries. A library would  
be one file only: no more header/.di files, just one compact file  
containing all the needed information (in a binary-formatted database  
that can be read very quickly).
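
A rough sketch of what one record in such a project database might hold 
(all names invented, purely illustrative):

// One entry per source file in the hypothetical whole-project cache.
struct CacheEntry
{
    string path;                  // source file this entry describes
    ulong contentHash;            // hash of the source text, to spot edits
    string[string] depInterfaces; // dependency -> interface at last build
    ubyte[] objectCode;           // compilation result, replacing the .o
}

// Rebuild if the file's own text changed, or if a dependency's public
// interface no longer matches the snapshot from the last build.
bool needsRebuild(CacheEntry entry, ulong newHash,
                  string[string] currentInterfaces)
{
    if (entry.contentHash != newHash)
        return true;
    foreach (mod, oldIface; entry.depInterfaces)
    {
        auto p = mod in currentInterfaces;
        if (p is null || *p != oldIface)
            return true;
    }
    return false;
}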
March 22, 2009
Re: Please integrate build framework into the compiler
> Such things should of course be told to the compiler somehow, by using
> the project configuration or by other means. (It's only a matter of
> definition.)

Maybe like Delphi did it.

There is a file called .dpr (Delphi project) which holds the 
absolute/relative paths of the units used in the project. It can be 
seen as a Delphi-source-based makefile.

test.dpr
---
program test;

uses // like D's import
  unit1 in '\temp\unit1.pas',
  unit2 in '\bla\unit2.pas',
  unit3 in '\blub\unit3.pas',
  ...
---

unit1.pas
---
unit unit1;

interface

uses
  unit2,
  unit3;

...

implementation

...

end.
---


The .pas source files are compiled into a Delphi-compiler-specific 
"object file format" called .dcu (Delphi compiled unit), which holds all 
the data the compiler needs when a unit is used several times (if the 
compiler finds a .dcu it will use it, and only compiles the .pas to a 
.dcu if needed).

I think the blazingly fast parser (and the absence of generic 
programming features) makes Delphi the fastest compiler out there; the 
compile speed is comparable to sending a message through ICQ or saving 
a small file.

Does the dmd compiler have rich compile/link-time intermediate files?

And by the way: if we do compile-time benchmarks, Delphi is the 
reference that is hard to beat.

But I still don't like Delphi :-)