December 10, 2018
On Mon, Dec 10, 2018 at 10:30 AM Neia Neutuladh via Digitalmars-d-announce <digitalmars-d-announce@puremagic.com> wrote:
>
> I wrote a post about language-agnostic (or, more accurately, cross-language) build tools, primarily using D as an example and Dub as a benchmark.
>
> Spoiler: dub wins in speed, simplicity, dependency management, and actually working without modifying the tool's source code.
>
> https://blog.ikeran.org/?p=339

Why isn't premake in the list? It's the only build tool that works reasonably well with IDEs, and it has had D well supported for 6-7 years now.
It also doesn't depend on a horrible runtime language distro.
December 11, 2018
On Mon, 2018-12-10 at 13:01 -0800, H. S. Teoh via Digitalmars-d-announce wrote:
> 
[…]
> Wow.  Thanks for the writeup that convinces me that I don't need to waste time looking at Meson/Ninja.
[…]

The article is a personal opinion and that is fine. For me it is wrong. There is no mention of SCons, nor of the fact that Gradle builds C++ as well as JVM languages. Some of the points about Meson are right, some wrong, but it is a personal opinion and that is fine.

I shall continue to use Meson and Ninja because they are way, way better than Autotools (not mentioned but still used a lot) and better than SCons for many use cases. But this is also a personal opinion.

-- 
Russel.
===========================================
Dr Russel Winder      t: +44 20 7585 2200
41 Buckmaster Road    m: +44 7770 465 077
London SW11 1EN, UK   w: www.russel.org.uk



December 11, 2018
On Monday, 10 December 2018 at 18:27:48 UTC, Neia Neutuladh wrote:
> I wrote a post about language-agnostic (or, more accurately, cross-language) build tools, primarily using D as an example and Dub as a benchmark.
>
> Spoiler: dub wins in speed, simplicity, dependency management, and actually working without modifying the tool's source code.
>
> https://blog.ikeran.org/?p=339

No reggae? https://github.com/atilaneves/reggae/

dub is simple and has dependency management, and that's about it. Speed? It's as slow as molasses and hits the network on every invocation unless explicitly told not to -- never mind that there's already a dub.selections.json file and all of the dependencies have been fetched (which it could check, but doesn't).
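
As far as I know, "explicitly told not to" means passing flags like these on every single invocation (check `dub build --help`, since the exact spelling may vary between versions):

    # skip dependency resolution and any registry lookups
    dub build --nodeps --skip-registry=all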

Trying to do anything non-trivial in dub is an exercise in frustration. The problem is that it's the de facto D package manager, so as soon as you have dependencies you need dub whether you want to or not.

dub works great if you're writing an executable with some dependencies and hardly any other needs. After that...
December 11, 2018
On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:
> On Mon, 10 Dec 2018 21:53:40 +0000, GoaLitiuM wrote:
>> The results for touching the second file seem like an anomaly to me,
>
> The generated ninja file had one rule per source file. If your modules tend to import each other a lot, or if they transitively import the code that's doing expensive stuff, then one rule per source file is bad. If your modules have few transitive dependencies and they're each fast to compile, one rule per source file is good.

In typical D code, it's usually faster to compile per package than either all-at-once or per module, which is why per-package is the default in reggae.
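
To make the difference concrete, the two strategies boil down to invocations roughly like these (paths made up):

    # per module: one compiler invocation and one object file per source file
    dmd -c src/foo/a.d -ofbuild/foo_a.o
    dmd -c src/foo/b.d -ofbuild/foo_b.o

    # per package: all of a package's modules in a single invocation / object
    dmd -c src/foo/a.d src/foo/b.d -ofbuild/foo.o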

> My project used Pegged, and a lot of stuff referenced the grammar. That meant incremental builds went long and it would have been better to build the whole project at once.
>
> Separating the grammar into a different build would reduce compile times significantly, and that might make incremental builds fast.

Using Pegged basically requires a dub subpackage with the grammar to retain one's sanity.
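
Something along these lines in dub.json (names and versions made up); the point is that the grammar only gets rebuilt when the subpackage itself changes:

    {
        "name": "myproj",
        "dependencies": { "myproj:grammar": "*" },
        "subPackages": [
            {
                "name": "grammar",
                "dependencies": { "pegged": "~>0.4.0" },
                "sourcePaths": ["grammar"]
            }
        ]
    }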

> From discussions on IRC about reducing compile times, though, using Phobos is a good way to get slow compilation, and I use Phobos. That alone means incremental builds are likely to go long.

Yes. Especially with -unittest.

December 11, 2018
Am 10.12.2018 um 22:01 schrieb H. S. Teoh:
> (...)
> 
> Convenience and simplicity, sure.  But speed? I'm sorry to say, I tried
> dub for 2 days and gave up in frustration because it was making my
> builds *several times longer* than a custom SCons script.  I find that
> completely unacceptable.
> 
> It also requires network access.  On *every* invocation, unless
> explicitly turned off.  And even then, it performs time-consuming
> dependency resolutions on every invocation, which doubles or triples
> incremental build times.  Again, unacceptable.

The upgrade check has been disabled in one of the latest releases, so as long as the dependencies have been resolved before, it will not access the network anymore. A notable exception is single-file packages, which don't have a dub.selections.json - we should probably do something about that, too, at some point.
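
For reference, dub.selections.json just pins the resolved dependency versions so that no resolution (and hence no network access) is needed; it looks roughly like this, with package names and versions being only examples:

    {
        "fileVersion": 1,
        "versions": {
            "vibe-d": "0.8.4",
            "pegged": "0.4.4"
        }
    }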

I also rewrote the dependency resolution a while ago, and nowadays it is usually not noticeable anymore.

Then there was an issue where LDC was invoked far too frequently to determine whether it outputs COFF files or not, making it look like scanning the file system for changes took unacceptably long. This has also been fixed.

The main open point right now, AFAICS, is to make --parallel work with the multiple-files-at-once build modes on machines that have enough RAM. This is rather simple, but someone has to do it. Apart from that, I think the current state is relatively fine from a performance point of view.

> 
> Then it requires a specific source layout, with incomplete /
> non-existent configuration options for alternatives.  Which makes it
> unusable for existing code bases.  Unacceptable.

You can define arbitrary import/source directories and list (or delist) source files individually if you want. There are restrictions on the naming of the output binary, though; is that what you mean?
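
To illustrate the former, a dub.json along these lines (paths made up) builds an existing tree without moving anything into source/:

    {
        "name": "legacy-project",
        "sourcePaths": ["src", "tools/common"],
        "importPaths": ["src", "include"],
        "sourceFiles": ["generated/config.d"],
        "excludedSourceFiles": ["src/experimental/*"]
    }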

> Worst of all, it does not support custom build actions, which is a
> requirement for many of my projects.  It does not support polyglot
> projects. It either does not support explicit control over exact build
> commands, or any such support is so poorly documented it might as well
> not exist.  This is not only unacceptable, it is a show-stopper.

Do you mean modifying the compiler invocations that DUB generates or adding custom commands (aka pre/post build/generate commands)?
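
If it's the latter, those look like this in dub.json (the script names are made up; $PACKAGE_DIR is one of the variables DUB makes available to such commands):

    {
        "preGenerateCommands": ["$PACKAGE_DIR/tools/generate_bindings.sh"],
        "postBuildCommands": ["$PACKAGE_DIR/tools/package_assets.sh"]
    }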
December 11, 2018
On Tue, Dec 11, 2018 at 09:58:39AM +0000, Atila Neves via Digitalmars-d-announce wrote:
> On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:
[...]
> In typical D code, it's usually faster to compile per package than either all-at-once or per module. Which is why it's the default in reggae.

Yeah, for projects past a certain size, compiling per package makes the most sense.


[...]
> > From discussions on IRC about reducing compile times, though, using Phobos is a good way to get slow compilation, and I use Phobos. That alone means incremental builds are likely to go long.
> 
> Yes. Especially with -unittest.

We've talked about this before.  Jonathan Marler actually ran a test and discovered that it wasn't something *directly* to do with unittests; the performance hit was coming from some unexpected interactions with the way the compiler instantiates templates when -unittest is enabled.  I don't remember what the conclusion was, though.

Either way, the unittest problem needs to be addressed.  I've been running into problems with compiling my code with -unittest, because it causes ALL unittests of ALL packages to be compiled, including Phobos and external libraries.  It's making it very hard to manage exactly what is unittested -- I want to unittest my *own* code, not any 3rd party libraries or Phobos, but right now, there's no way to control that.

Recently I ran into a roadblock with -unittest: I have a project with rather extensive unittests, but it assumes certain things about the current working directory and the current environment (because those unittests are run from a special unittest driver). I have that project as a git submodule in a different project for experimental purposes, but now I can't compile with -unittest because the former project's unittests will fail, not being run in the expected environment. :-(

There needs to be a more fine-grained way of controlling which unittests get compiled.  Generally, I don't see why I should care about unittests for external dependencies (including Phobos) when what I really want is to test the *current* project's code.
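
The closest thing to a stopgap I can think of is wrapping a project's own unittests in a custom version identifier (the name below is made up) -- which is exactly the kind of manual bookkeeping a proper solution shouldn't require:

    // In the submodule's code; SubprojTests is a made-up version identifier.
    version (SubprojTests) unittest
    {
        import std.file : exists;
        // assumes the special working directory its own test driver sets up
        assert(exists("testdata/input.txt"));
    }

    // The submodule's own test driver builds with:
    //     dmd -unittest -version=SubprojTests ...
    // while the outer project builds with plain -unittest and never
    // compiles these blocks in.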


T

-- 
The two rules of success: 1. Don't tell everything you know. -- YHL
December 11, 2018
On Tue, Dec 11, 2018 at 09:54:06AM +0000, Atila Neves via Digitalmars-d-announce wrote: [...]
> No reggae? https://github.com/atilaneves/reggae/

I recently finally sat down and took a look at Button, posted here a few years ago.  It looked pretty good.  One of these days I really need to sit down and take a good look at reggae.


> dub is simple and has dependency management, and that's about it. Speed?  It's as slow as molasses and hits the network every time unless explicitly told not to. Never mind if there's already a dub.selections.json file and all of the dependencies are fetched (which it could check but doesn't).

According to Sönke's post elsewhere in this thread, these performance issues have been addressed in the latest version.  I haven't tried it out to verify that yet, though.


> Trying to do anything non-trivial in dub is a exercise in frustration. The problem is that it's the de facto D package manager, so as soon as you have dependencies you need dub whether you want to or not.

After fighting with dub for 2 days (or was it a week? it certainly felt longer :-P) in my vibe.d project, I ended up just creating an empty dummy project in a subdirectory that declares a dependency on vibe.d. I run dub separately there to fetch and build vibe.d, then ignore the rest of the dummy project, go back to the real project root, and have SCons build the real executable for me. So far, that has worked reasonably well, apart from the occasional annoyance of having to re-run dub to update to the latest vibe.d packages.


> dub works great if you're writing an executable with some dependencies and hardly any other needs. After that...

Yeah.  Being unable to handle generated source files is a showstopper for many of my projects.  As Neia said, while D has some very nice compile-time codegen features, sometimes you really just need to write an external utility that generates source code.

For example, one of my current projects involves parsing GLSL source files and generating D wrapper code as syntactic sugar for calls to glUniform* and glAttrib*, so that I can just say `myshader.color = Vector(1, 2, 3);` instead of manually calling glUniform* with fiddly, error-prone byte offsets. While in theory I could use string imports and CTFE to do this, it's far less hairy to do it as an external step.
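
The generated wrapper ends up looking something like this (heavily simplified; the real code caches uniform locations and knows the GLSL types, and the GL prototypes would normally come from whatever OpenGL binding/loader is in use):

    // Prototypes normally supplied by an OpenGL binding:
    extern (C) int glGetUniformLocation(uint program, const(char)* name);
    extern (C) void glUniform3f(int location, float x, float y, float z);

    struct Vector { float x, y, z; }

    // Hypothetical generated wrapper for a shader declaring `uniform vec3 color;`.
    struct MyShader
    {
        uint programId;

        @property void color(Vector v)
        {
            auto loc = glGetUniformLocation(programId, "color");
            glUniform3f(loc, v.x, v.y, v.z);
        }
    }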

Most build systems with automatic dependency extraction would fail when given this sort of setup, because they generally depend on scanning directory contents, but in this case the file may not have been generated yet (it would not be generated until the D code of the tool that generates it is first compiled, then run). So the dependency would be missed, resulting either in intermittent build failure or failure to recompile dependents when the generated code changes.  It's not so simple to just do codegen as a special preprocessing step -- such tasks need to be treated as 1st class dependency tasks and handled natively as part of DAG resolution, not as something tacked on as an afterthought.


T

-- 
Music critic: "That's an imitation fugue!"
December 11, 2018
On 12/11/18 12:39 PM, H. S. Teoh wrote:
> On Tue, Dec 11, 2018 at 09:58:39AM +0000, Atila Neves via Digitalmars-d-announce wrote:
>> On Monday, 10 December 2018 at 22:18:28 UTC, Neia Neutuladh wrote:
> [...]
>> In typical D code, it's usually faster to compile per package than
>> either all-at-once or per module. Which is why it's the default in
>> reggae.
> 
> Yeah, for projects past a certain size, compiling per package makes the
> most sense.
> 
> 
> [...]
>>>  From discussions on IRC about reducing compile times, though, using
>>> Phobos is a good way to get slow compilation, and I use Phobos. That
>>> alone means incremental builds are likely to go long.
>>
>> Yes. Especially with -unittest.
> 
> We've talked about this before.  Jonathan Marler actually ran a test and
> discovered that it wasn't something *directly* to do with unittests; the
> performance hit was coming from some unexpected interactions with the
> way the compiler instantiates templates when -unittest is enabled.  I
> don't remember what the conclusion was, though.

I remember:

1. When unittests are enabled, -allinst is enabled as well.
2. This means that all templates instantiated are included as if they were part of the local module.
3. This means that they are semantically analyzed, and if they import anything, all those imports are processed as well.
4. Recurse on step 2.

Note that the reason -allinst is used is that templates sometimes compile differently when unittests are enabled. In other words, you might for instance get a different struct layout when unittests are enabled -- this prevents that (but only for templates, of course).
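
A contrived illustration of the kind of mismatch this guards against (not the actual failing case, just the shape of the problem):

    // If one module instantiates Box!int with -unittest and another
    // without, the two disagree about the struct's layout.
    struct Box(T)
    {
        T value;
        version (unittest)
            size_t testOnlyCounter; // present only in unittest builds
    }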

The ultimate reason the PR (which removed the -allinst flag for unittests) failed was differences in compiler flags for different modules during Phobos unittests, which caused symbol name mangling changes (IIRC, mostly surrounding dip1000 problems).

I really wish we could have followed through on that PR...

-Steve
December 11, 2018
On Tue, Dec 11, 2018 at 01:56:24PM -0500, Steven Schveighoffer via Digitalmars-d-announce wrote: [...]
> 1. When unittests are enabled, -allinst is enabled as well.
> 2. This means that all templates instantiated are included as if they
> were part of the local module.
> 3. This means that they are semantically analyzed, and if they import
> anything, all those imports are processed as well
> 4. Recurse on step 2.
> 
> Note that the reason allinst is used is because sometimes templates compile differently when unittests are enabled. In other words, you might for instance get a different struct layout for when unittests are enabled -- this prevents that (but only for templates of course).
> 
> The ultimate reason why the PR (which removed the -allinst flag for unittests) was failing was because of differences in compiler flags for different modules during unittests in Phobos. This caused symbol name mangling changes (IIRC, mostly surrounding dip1000 problems).
> 
> I really wish we could have followed through on that PR...
[...]

Argh.  Another badly needed fix stuck in PR limbo. :-( :-( :-(  Some
days, things like these really make me wish D3 was a thing.

Is there some way of recording this info somewhere -- Bugzilla, I guess -- so that it will get addressed at *some point* rather than being forgotten forever?  I was hoping this issue would be addressed within the next few releases, but hope seems slim now. :-(


T

-- 
"The number you have dialed is imaginary. Please rotate your phone 90 degrees and try again."
December 11, 2018
On Tue, Dec 11, 2018 at 11:26:45AM +0100, Sönke Ludwig via Digitalmars-d-announce wrote: [...]
> The upgrade check has been disabled in one of the latest releases, so unless the dependencies haven't been resolved before, it will not access the network anymore. A notable exception are single-file packages, which don't have a dub.selections.json - we should probably do something about this, too, at some point.
> 
> I've also rewritten the dependency resolution a while ago and it usually is not noticeable anymore nowadays.
> 
> Then there was an issue where LDC was invoked far too frequently to determine whether it outputs COFF files or not, making it look like scanning the file system for changes took unacceptably long. This has also been fixed.

This is very encouraging to hear.  Thanks!


> The main open point right now AFAICS is to make --parallel work with the multiple-files-at-once build modes for machines that have enough RAM. This is rather simple, but someone has to do it. But apart from that, I think that the current state is relatively fine from a performance point of view.

Wait, what does --parallel do if it doesn't compile multiple files at once?


> > Then it requires a specific source layout, with incomplete / non-existent configuration options for alternatives.  Which makes it unusable for existing code bases.  Unacceptable.
> 
> You can define arbitrary import/source directories and list (or delist) source files individually if you want. There are restrictions on the naming of the output binary, though, is that what you mean?

Is this documented? I couldn't find any info on it the last time I looked.

Also, you refer to "the output binary". Does that mean I cannot generate multiple executables? 'cos that's a showstopper for me.


> > Worst of all, it does not support custom build actions, which is a requirement for many of my projects.  It does not support polyglot projects. It either does not support explicit control over exact build commands, or any such support is so poorly documented it might as well not exist.  This is not only unacceptable, it is a show-stopper.
> 
> Do you mean modifying the compiler invocations that DUB generates or adding custom commands (aka pre/post build/generate commands)?

Does dub support the following scenario?

- There's a bunch of .java files that have to be compiled with javac.
   - But some of the .java files are generated by an external tool, that
     must be run first, before the .java files are compiled.
- There's a bunch of .d files in two directories.
   - The second directory contains .d files that need to be compiled
     into multiple executables, and they must be compiled with a local
     (i.e., non-cross) compiler.
   - Some of the resulting executables must be run first in order to
     generate a few .d files in the first directory (in addition to
     what's already there).
   - After the .d files are generated, the first directory needs to be
     compiled TWICE: once with a cross-compiler (LDC, targetting
     Arm/Android), once with the local D compiler. The first compilation
     must link with cross-compilation Android runtime libraries, and the
     second compilation must link with local X11 libraries.
      - (And obviously, the build products must be put in separate
        subdirectories to prevent stomping over each other.)
- After the .java and .d files are compiled, a series of tools must be
  invoked to generate an .apk file, which also includes a bunch of
  non-code files in resource subdirectories.  Then, another tool must be
  run to align and sign the .apk file.

And here's a critical requirement: any time a file is changed (it can be a .java file, a .d file, or one of the resources that they depend on), all affected build products must be correctly updated. This must be done as efficiently as possible, because it's part of my code-compile-test cycle, and if it requires more than a few seconds or recompiling the entire codebase, it's a no-go.

If dub can handle this, then I'm suitably impressed, and retract most of my criticisms against it. ;-)


T

-- 
Study gravitation, it's a field with a lot of potential.