August 18, 2013
How does the compiler do static typing across multiple source files? I heard the D compiler mallocs memory and never frees it, to speed up compilation, but I'm guessing each instance doesn't compile just one source file? My question is: if I have a function in this file and another in a different file, what does the compiler do when both files need to know the other's definition? Also, how does it handle modules?

From something else I read, text parsing can be ridiculously fast, so there may be no need for a binary representation of each parsed file. Does the D compiler read all source files into memory, generate the AST, and then start compiling each file? I know there's more than one compiler, but I wouldn't mind hearing from either or both if they differ.
August 19, 2013
On 2013-08-18 17:31, ProgrammingGhost wrote:
> How does the compiler do static typing across multiple source files? I heard the D compiler mallocs memory and never frees it, to speed up compilation, but I'm guessing each instance doesn't compile just one source file? My question is: if I have a function in this file and another in a different file, what does the compiler do when both files need to know the other's definition? Also, how does it handle modules?
>
> From something else I read, text parsing can be ridiculously fast, so there may be no need for a binary representation of each parsed file. Does the D compiler read all source files into memory, generate the AST, and then start compiling each file? I know there's more than one compiler, but I wouldn't mind hearing from either or both if they differ.

The compiler will start compiling the files passed on the command line. It will read the files asynchronously and then lex, parse, build an AST and do semantic analysis.

When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.

The reason for waiting until semantic analysis is done is that you can have code looking like this:

mixin("import foo;");

The expansion of the mixin and other similar features is done in the semantic analysis phase.
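
As a rough sketch (the module name "foo" here is just a placeholder), the import may not even be spelled out until a string has been built at compile time:

enum moduleName = "foo";
mixin("import " ~ moduleName ~ ";");

void main()
{
    // Symbols from foo only become visible once the mixin has been
    // expanded during semantic analysis.
}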

-- 
/Jacob Carlborg
August 19, 2013
On Monday, 19 August 2013 at 11:01:54 UTC, Jacob Carlborg wrote:
> The compiler will start compiling the files passed on the command line. It will read the files asynchronously and then lex, parse, build an AST and do semantic analysis.
>
> When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.
>
> The reason for waiting until semantic analysis is done is that you can have code looking like this:
>
> mixin("import foo;");
>
> The expansion of the mixin and other similar features is done in the semantic analysis phase.

So everything is parsed once and kept in memory until the compiler finishes every source file? Are there any RAM problems when compiling large codebases? My experience with D is limited. Are libraries the same as C libraries? From my understanding the linker figures that part out and the compiler needs a separate file for the definitions. If I build a library in D, is it the same as a C library, or something different that includes the function definitions?

Sorry if I'm confused, I know almost nothing about D. I stick to .NET, Java and C++.
August 19, 2013
On Monday, 19 August 2013 at 17:15:35 UTC, ProgrammingGhost wrote:
> On Monday, 19 August 2013 at 11:01:54 UTC, Jacob Carlborg wrote:
>> The compiler will start compiling the files passed on the command line. It will read the files asynchronously and then lex, parse, build an AST and do semantic analysis.
>>
>> When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.
>>
>> The reason for waiting until semantic analysis is done is that you can have code looking like this:
>>
>> mixin("import foo;");
>>
>> The expansion of the mixin and other similar features is done in the semantic analysis phase.
>
> So everything is parsed once and kept in memory until the compiler finishes every source file? Are there any RAM problems when compiling large codebases?

Unfortunately, yes, if you give dmd a very large number of files all at once, it will chew through all your free RAM. But dmd does support separate compilation:

$ dmd file1.d -c
$ dmd file2.d -c
$ dmd file1.o file2.o

which alleviates the problem.
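
For instance, with two hypothetical modules like these, each -c invocation only generates object code for one module (imports are still analysed, but not compiled), and the last step just hands the object files to the linker:

// file1.d
module file1;

import file2;

void main()
{
    greet();
}

// file2.d
module file2;

import std.stdio;

void greet()
{
    writeln("hello");
}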

> My experience with D is limited. Are libraries the same as C libraries? From my understanding the linker figures that part out and the compiler needs a separate file for the definitions. If I build a library in D, is it the same as a C library, or something different that includes the function definitions?
>
> Sorry if I'm confused, I know almost nothing about D. I stick to .NET, Java and C++.

Libraries in D use the same formats as C/C++ libraries.
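
For example, on a Unix-like system, dmd can build a static library from the hypothetical file2.d above and then link against it just like a C library (the library name is made up):

$ dmd -lib file2.d -oflibutil.a
$ dmd file1.d libutil.a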
August 19, 2013
On Monday, 19 August 2013 at 17:35:39 UTC, John Colvin wrote:
> On Monday, 19 August 2013 at 17:15:35 UTC, ProgrammingGhost wrote:
>> On Monday, 19 August 2013 at 11:01:54 UTC, Jacob Carlborg wrote:
>>> The compiler will start compiling the files passed on the command line. It will read the files asynchronously and then lex, parse, build an AST and do semantic analysis.
>>>
>>> When the semantic analysis is done it will have access to all import declarations. It basically starts the same process for all these imports, recursively.
>>>
>>> The reason for waiting until semantic analysis is done is that you can have code looking like this:
>>>
>>> mixin("import foo;");
>>>
>>> The expansion of the mixin and other similar features is done in the semantic analysis phase.
>>
>> So everything is parsed once and kept in memory until the compiler finishes every source file? Are there any RAM problems when compiling large codebases?
>
> Unfortunately, yes, if you give dmd a very large number of files all at once, it will chew through all your free RAM. But dmd does support separate compilation:
>
> $ dmd file1.d -c
> $ dmd file2.d -c
> $ dmd file1.o file2.o
>
> which alleviates the problem.
>
>> My experience with D is limited. Are libraries the same as C libraries? From my understanding the linker figures that part out and the compiler needs a separate file for the definitions. If I build a library in D, is it the same as a C library, or something different that includes the function definitions?
>>
>> Sorry if I'm confused, I know almost nothing about D. I stick to .NET, Java and C++.
>
> Libraries in D use the same formats as C/C++ libraries.

Is it possible that if I just try to compile 1 file, it could import enough libraries that import/need the definitions for additional large libraries, which in turn also import everything, causing RAM issues? I'm sure in practice this will almost never happen. But I don't doubt there are main libraries that use other large libraries and everything imports/uses everything.
August 20, 2013
On 2013-08-20 00:27, ProgrammingGhost wrote:

> Is it possible that if I just try to compile 1 file, it could import
> enough libraries that import/need the definitions for additional large
> libraries, which in turn also import everything, causing RAM issues?
> I'm sure in practice this will almost never happen. But I don't doubt
> there are main libraries that use other large libraries and everything
> imports/uses everything.

It's theoretically possible. But one big difference between D and C/C++ is that D uses symbolic inclusion where C/C++ uses textual inclusion. In C/C++ you end up with these enormous translation units because of this. That won't happen in D.

In C/C++, when you see "#include <stdio.h>" for example, the preprocessor basically copy-pastes the content of stdio.h to where the include was located. In D the compiler just makes a note that a given module imports another; no content is copied.
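
Roughly, an import in D is just a reference (tiny sketch):

import std.stdio;

void main()
{
    // Within one compiler run, std.stdio is read and analysed once;
    // every module that imports it shares that analysed symbol table
    // rather than re-reading the text.
    writeln("hello");
}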

-- 
/Jacob Carlborg
August 20, 2013
On 2013-08-20 00:27, ProgrammingGhost wrote:

> Is it possible that if I just try to compile 1 file, it could import
> enough libraries that import/need the definitions for additional large
> libraries, which in turn also import everything, causing RAM issues?
> I'm sure in practice this will almost never happen. But I don't doubt
> there are main libraries that use other large libraries and everything
> imports/uses everything.

I guess I should add that we do have some problems with RAM when running the unit tests in Phobos (the standard library). But that's rather due to the heavy use of templates and other compile-time features, not because there is too much code/text in the files.
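
A toy sketch of where that memory goes (names made up): every distinct instantiation below becomes a separate type that the compiler has to keep around for the rest of the build.

struct FixedArray(T, size_t N)
{
    T[N] data;
}

void main()
{
    FixedArray!(int, 1) a;
    FixedArray!(int, 2) b;
    FixedArray!(double, 3) c;
    // Template-heavy library code multiplies this by thousands of
    // instantiations, which is where the RAM goes at compile time.
}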

-- 
/Jacob Carlborg
August 20, 2013
On Tuesday, 20 August 2013 at 06:50:12 UTC, Jacob Carlborg wrote:
> On 2013-08-20 00:27, ProgrammingGhost wrote:
>
>> Is it possible that if I just try to compile 1 file, it could import
>> enough libraries that import/need the definitions for additional large
>> libraries, which in turn also import everything, causing RAM issues?
>> I'm sure in practice this will almost never happen. But I don't doubt
>> there are main libraries that use other large libraries and everything
>> imports/uses everything.
>
> I guess I should add that we do have some problems with RAM when running the unit tests in Phobos (the standard library). But that's rather due to the heavy use of templates and other compile-time features, not because there is too much code/text in the files.

Hah, ram problems running the unittests..... This old laptop can't even summon enough ram to compile phobos at all!
August 20, 2013
On 2013-08-20 12:45, John Colvin wrote:

> Hah, ram problems running the unittests..... This old laptop can't even
> summon enough ram to compile phobos at all!

Haha, that's bad. How much RAM do you have?

-- 
/Jacob Carlborg
August 20, 2013
On Tuesday, 20 August 2013 at 11:08:51 UTC, Jacob Carlborg wrote:
> On 2013-08-20 12:45, John Colvin wrote:
>
>> Hah, ram problems running the unittests..... This old laptop can't even
>> summon enough ram to compile phobos at all!
>
> Haha, that's bad. How much RAM do you have?

Only 2GB

It can work... but I have to close everything else running first, otherwise it locks everything up and then crashes out complaining about not having enough memory to fork.