Reggae v0.0.5 super alpha: A build system in D
April 03, 2015
I wanted to work on this a little more before announcing it, but it seems I'm going to be busy working on trying to get unit-threaded into std.experimental so here it is:

http://code.dlang.org/packages/reggae

If you're wondering about the name, it's because it's supposed to build on dub.

You might wonder at some of the design decisions. Some of them are solutions to weird problems caused by writing build descriptions in a compiled language; others I'm not too sure of. Should compiler flags be an array of strings or a string? I got tired of typing square brackets, so it's a string for now.

Please let me know if the API is suitable or not, preferably by trying to actually use it to build your software.

Existing dub projects might work by just doing this from a build directory of your choice: "reggae -b make /path/to/project". That should generate a Makefile (or the equivalent Ninja files if `-b ninja` is used) to do what `dub build` usually does. It _should_ work for all dub projects, but it doesn't right now. For at least a few projects it's due to bugs in `dub describe`; for others it might be bugs in reggae, I don't know yet. Any dub.json file that uses dub configurations extensively is likely not to work.

Features:

. Make and Ninja backends (tup will be the next one)
. Automatically imports dub projects and writes the reggae build configuration
. Access to all objects to be built with dub (including dependencies) when writing custom builds (reggae does this itself)
. Out-of-tree builds, like CMake
. Arbitrary build rules, plus pre-built ease-of-use higher-level targets
. Separate compilation. One file changes, only one file gets rebuilt
. Automatic dependency detection for D, C, and C++ source files
. Can build itself (but includes too many object files, another `dub describe` bug)

There are several runnable examples in the features directory, in the form of Cucumber tests. They include linking D code to C++.
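
To give an idea of the kind of interop exercised there, the D-calls-C++ part of such an example looks roughly like this (hypothetical function, not necessarily the one the feature tests use):

// C++ side, built with a C++ compiler:
//     int twice(int i) { return 2 * i; }

// D side, linked against the C++ object file:
extern(C++) int twice(int i);

void main()
{
    assert(twice(21) == 42);
}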

I submitted a proposal to talk about this at DConf but I'll be talking about testing instead. Maybe next year? Anyway, destroy!

Atila
April 03, 2015
On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> . Separate compilation. One file changes, only one file gets rebuilt

This immediately caught my eye as a huge "no" in the description. We must ban C-style separate compilation; there is simply no way to move forward otherwise. At the very least we should not endorse it in any way.
April 03, 2015
Also, I don't see any point in yet another meta build system. The very point of the initial discussion was getting a D-only cross-platform solution that won't require installing any additional software beyond a working D compiler.
April 03, 2015
On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
> Also, I don't see any point in yet another meta build system. The very point of the initial discussion was getting a D-only cross-platform solution that won't require installing any additional software beyond a working D compiler.

I was also thinking of a binary backend (producing a binary executable that does the build, kinda like what ctRegex does but for builds), and also something that just builds it on the spot.

The thing is, I want to get feedback on the API first and foremost, and delegating the whole do-I-or-do-I-not-need-to-build-it logic to programs that already do that (and well) was the obvious first choice (for me).

Also, Ninja is _really_ fast.
April 03, 2015
On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:
> On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
>> . Separate compilation. One file changes, only one file gets rebuilt
>
> This immediately caught my eye as a huge "no" in the description. We must ban C-style separate compilation; there is simply no way to move forward otherwise. At the very least we should not endorse it in any way.

I understand that. But:

1. One of D's advantages is fast compilation. I don't think that means we should compile everything all the time because we can (it's fast anyway!)
2. There are measurable differences in compile time. While working on reggae I got much faster edit-compile-unittest cycles because of separate compilation.
3. This is valuable feedback. I was wondering what everybody else would think. It could be configurable, your "not endorse it in any way" notwithstanding. I for one would rather have it compile separately.
4. CTFE and memory consumption can go through the roof (anecdotally anyway, it's never been a problem for me) when compiling everything at once.
April 03, 2015
On Fri, Apr 03, 2015 at 17:10:31 +0000, Dicebot via Digitalmars-d-announce wrote:
> On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
> > . Separate compilation. One file changes, only one file gets rebuilt
> 
> This immediately caught my eye as a huge "no" in the description. We must ban C-style separate compilation; there is simply no way to move forward otherwise. At the very least we should not endorse it in any way.

Why? Other than the -fversion=... stuff, what is really blocking this? I personally find unity builds to not be worth it, but I don't see anything blocking separate compilation for D if dependencies are set up properly.

--Ben
April 03, 2015
On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:
> On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
>> Also, I don't see any point in yet another meta build system. The very point of the initial discussion was getting a D-only cross-platform solution that won't require installing any additional software beyond a working D compiler.
>
> I was also thinking of a binary backend (producing a binary executable that does the build, kinda like what ctRegex does but for builds), and also something that just builds it on the spot.
>
> The thing is, I want to get feedback on the API first and foremost, and delegating the whole do-I-or-do-I-not-need-to-build-it logic to programs that already do that (and well) was the obvious first choice (for me).
>
> Also, Ninja is _really_ fast.

The thing is, it may actually affect the API. The way I had originally expected it to work, any legal D code would be allowed for build commands instead of a pure DSL approach. So instead of providing a high-level abstraction like this:

const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out", Target("src/main.d"));
const mathsObj = Target("maths.o", "dmd -c $in -of$out", Target("src/maths.d"));
const app = Target("myapp", "dmd -of$out $in", [mainObj, mathsObj]);

... you instead define dependency building blocks in the D domain:

struct App
{
    enum  path = "./myapp";
    alias deps = Depends!(mainObj, mathsObj);

    static void generate()
    {
        import std.exception, std.process;
        enforce(execute([ "dmd",  "-ofmyapp", deps[0].path, deps[1].path]).status == 0);
    }
}

And provide higher-level helper abstractions on top of that, tuned for D projects. This is just random syntax I have invented as an example, of course. It is already possible to write decent cross-platform scripts in D - only a dependency tracking library is missing. But of course that would make using other build systems as backends impossible.
April 03, 2015
On Friday, 3 April 2015 at 17:25:51 UTC, Ben Boeckel wrote:
> On Fri, Apr 03, 2015 at 17:10:31 +0000, Dicebot via Digitalmars-d-announce wrote:
>> On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
>> > . Separate compilation. One file changes, only one file gets rebuilt
>> 
>> This immediately caught my eye as a huge "no" in the description. We must ban C-style separate compilation; there is simply no way to move forward otherwise. At the very least we should not endorse it in any way.
>
> Why? Other than the -fversion=... stuff, what is really blocking this? I
> personally find unity builds to not be worth it, but I don't see
> anything blocking separate compilation for D if dependencies are set up
> properly.
>
> --Ben

There are 2 big problems with C-style separate compilation:

1)

It complicates whole-program optimization possibilities. Old-school object files are simply not good enough to preserve the information necessary to produce optimized builds, and we are not in a position to create our own metadata + linker combo to circumvent that. This also applies to attribute inference, which has become a really important development direction for handling the growing attribute hell.

During the last D Berlin Meetup we had an interesting conversation about attribute inference with Martin Nowak, and dropping legacy C-style separate compilation seemed to be recognized as unavoidable for implementing anything decent in that domain.

2)

Ironically, it is just very slow. Those who come from the C world are used to relying on separate compilation to speed up rebuilds, but it doesn't work that way in D. It may look better if you change only 1 or 2 modules, but as the number of modified modules grows, an incremental rebuild quickly becomes _slower_ than a full program build with all files processed in one go. It can sometimes result in an order of magnitude slowdown (personal experience).

The difference from C is that repeated imports are very cheap in D (you don't copy-paste module content again and again like with headers), but at the same time semantic analysis of an imported module is more expensive (because D semantics are more complicated). When you do separate compilation you discard the already-processed imports and repeat that analysis from the very beginning for each newly compiled file, accumulating a huge slowdown for the application in total.

To get the best compilation speed in D you want to process as many modules with shared imports at one time as possible. At the same time, for really big projects that becomes infeasible at some point, especially if CTFE is heavily used and memory consumption explodes. In that case the best approach is partial separate compilation - decoupling parts of a program into static libraries and compiling the separate libraries in parallel - but still compiling each library in one go. That gives you parallelization without doing the same costly work again and again.
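
Expressed in the Target notation used earlier in this thread, the difference would look roughly like this (paths, flags and the library split are made up for illustration):

// each library's modules are compiled together in a single compiler invocation
const libMaths = Target("libmaths.a", "dmd -lib -of$out $in",
                        [Target("src/maths/algebra.d"), Target("src/maths/calculus.d")]);
const libIo    = Target("libio.a", "dmd -lib -of$out $in",
                        [Target("src/io/file.d"), Target("src/io/net.d")]);

// the final link depends on the libraries, which can be built in parallel,
// instead of on one object file per module
const app = Target("myapp", "dmd -of$out $in",
                   [Target("src/main.d"), libMaths, libIo]);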
April 03, 2015
On Friday, 3 April 2015 at 17:22:42 UTC, Atila Neves wrote:
> On Friday, 3 April 2015 at 17:10:33 UTC, Dicebot wrote:
>> On Friday, 3 April 2015 at 17:03:35 UTC, Atila Neves wrote:
>>> . Separate compilation. One file changes, only one file gets rebuilt
>>
>> This immediately caught my eye as a huge "no" in the description. We must ban C-style separate compilation; there is simply no way to move forward otherwise. At the very least we should not endorse it in any way.
>
> I understand that. But:
>
> 1. One of D's advantages is fast compilation. I don't think that means we should compile everything all the time because we can (it's fast anyway!)
> 2. There are measurable differences in compile time. While working on reggae I got much faster edit-compile-unittest cycles because of separate compilation.
> 3. This is valuable feedback. I was wondering what everybody else would think. It could be configurable, your "not endorse it in any way" notwithstanding. I for one would rather have it compile separately.
> 4. CTFE and memory consumption can go through the roof (anecdotally anyway, it's never been a problem for me) when compiling everything at once.

See http://forum.dlang.org/post/nhaoahnqucqkjgdwtxsa@forum.dlang.org

tl;dr: separate compilation support is necessary, but not at the single-module level.
April 03, 2015
On Friday, 3 April 2015 at 17:40:42 UTC, Dicebot wrote:
> On Friday, 3 April 2015 at 17:17:50 UTC, Atila Neves wrote:
>> On Friday, 3 April 2015 at 17:13:41 UTC, Dicebot wrote:
>>> Also, I don't see any point in yet another meta build system. The very point of the initial discussion was getting a D-only cross-platform solution that won't require installing any additional software beyond a working D compiler.
>>
>> I was also thinking of a binary backend (producing a binary executable that does the build, kinda like what ctRegex does but for builds), and also something that just builds it on the spot.
>>
>> The thing is, I want to get feedback on the API first and foremost, and delegating the whole do-I-or-do-I-not-need-to-build-it logic to programs that already do that (and well) was the obvious first choice (for me).
>>
>> Also, Ninja is _really_ fast.
>
> The thing is, it may actually affect the API. The way I had originally expected it to work, any legal D code would be allowed for build commands instead of a pure DSL approach. So instead of providing a high-level abstraction like this:
>
> const mainObj  = Target("main.o",  "dmd -I$project/src -c $in -of$out", Target("src/main.d"));
> const mathsObj = Target("maths.o", "dmd -c $in -of$out", Target("src/maths.d"));
> const app = Target("myapp", "dmd -of$out $in", [mainObj, mathsObj]);
>
> ... you instead define dependency building blocks in the D domain:
>
> struct App
> {
>     enum  path = "./myapp";
>     alias deps = Depends!(mainObj, mathsObj);
>
>     static void generate()
>     {
>         import std.exception, std.process;
>         enforce(execute([ "dmd",  "-ofmyapp", deps[0].path, deps[1].path]).status == 0);
>     }
> }
>
> And provide higher-level helper abstractions on top of that, tuned for D projects. This is just random syntax I have invented as an example, of course. It is already possible to write decent cross-platform scripts in D - only a dependency tracking library is missing. But of course that would make using other build systems as backends impossible.

Well, I took your advice (and one of my acceptance tests is based on your simplified real-work example) and started with the low-level any-command-will-do API first. I built the high-level ones on top of that. It doesn't seem crazy to me that certain builds can only be done by certain backends. The fact that the make backend can track C/C++/D dependencies wasn't a given, and the implementation is quite ugly.

In any case, the Target structs aren't high-level abstractions, they're just data. Data that can be generated by any code. Your example is basically how the `dExe` rule works: run dmd at run-time, collect dependencies and build all the `Target` instances. You could have a D backend that outputs (then compiles and runs) your example. The "only" problem I can see is execution speed.
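
As a rough sketch of what "just data generated by code" means in practice, something along these lines (a toy helper; the real `dExe` rule also extracts import dependencies at run time, which this skips):

// toy helper: one object-file Target per D source file
Target[] objectTargets(string[] srcFiles)
{
    import std.algorithm: map;
    import std.array: array;
    import std.path: baseName, setExtension;

    return srcFiles
        .map!(f => Target(f.baseName.setExtension("o"),
                          "dmd -I$project/src -c $in -of$out",
                          Target(f)))
        .array;
}

// the final binary is again just a Target built from the generated ones
const app = Target("myapp", "dmd -of$out $in",
                   objectTargets(["src/main.d", "src/maths.d"]));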

Maybe I didn't include enough examples.

I also need to think of your example a bit more.