June 16, 2016
On Thursday, 16 June 2016 at 04:26:24 UTC, Jason White wrote:
> On Wednesday, 15 June 2016 at 12:00:52 UTC, Andrei Alexandrescu wrote:
>> I'd say the gating factor is -j. If a build system doesn't implement the equivalent of make -j, that's a showstopper.
>
> Don't worry, there is a --threads option and it defaults to the number of logical cores.
>
> I just did some tests and the reason it is slower than Make is because of the automatic dependency detection on every single command. I disabled the automatic dependency detection and compared it with Make again. Button was then roughly the same speed as Make -- sometimes it was faster, sometimes slower. Although, I think getting accurate dependencies at the cost of slightly slower builds is very much a worthwhile trade-off.

It would be a worthwhile trade-off, if those were the only two options available, but they're not. There are multiple build systems out there that do correct builds whilst being faster than make. Being faster is easy, because make is incredibly slow.

I didn't even find out about ninja from a blog post; I actively searched for a make alternative because I was tired of waiting for make.

Atila


June 16, 2016
On Thursday, 16 June 2016 at 12:32:02 UTC, Kagamin wrote:
> On Sunday, 12 June 2016 at 20:47:31 UTC, cym13 wrote:
>>> Yeah, I have often thought that writing a self-contained D program to build D would work well. The full power of the language would be available, there'd be nothing new to learn, and all you'd need is an existing D compiler (which we already require to build).
>>
>> What about Attila's work with reggae?
>
> Reggae still needs a prebuilt reggae to run the build script.

The idea would be to build reggae with the system dmd first (since having a D compiler is now a pre-requisite), then build dmd, druntime and phobos.

There are no extra dependencies except on the reggae source files. Again, that's the idea, at least.

Atila
June 16, 2016
On Thursday, 16 June 2016 at 13:40:39 UTC, Atila Neves wrote:
> The idea would be to build reggae with the system dmd first (since having a D compiler is now a pre-requisite)

If a D compiler is required, it means a prebuilt executable is not needed: rdmd should be enough to compile and run the build script.
June 17, 2016
On Thursday, 16 June 2016 at 12:34:26 UTC, Kagamin wrote:
> On Sunday, 12 June 2016 at 23:27:07 UTC, Jason White wrote:
>> However, I question the utility of even doing this in the first place. You miss out on the convenience of using the existing command line interface.
>
> Why the build script can't have a command line interface?

It could, but then the build script becomes more complicated, and for little gain. Adding command line options on top of that to configure the build would be painful.

It would be simpler and cleaner to write a D program to generate the JSON build description for Button to consume. Then you can add a command line interface to configure how the build description is generated. This is how the Lua build descriptions work[1].

[1] http://jasonwhite.github.io/button/docs/tutorial#going-meta-building-the-build-description
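As a rough sketch of what such a generator might look like in D: the program below emits a JSON build description and takes a flag to configure it. The rule shape (`inputs`/`task`/`outputs`) is only illustrative here, not necessarily Button's exact schema; see the linked docs for the real format.

```d
// build.d -- hypothetical generator for a Button-style JSON build
// description. The rule schema shown is illustrative, not authoritative.
import std.json;
import std.stdio;

void main(string[] args)
{
    // A command-line option configures how the description is generated,
    // e.g. debug vs. release flags.
    immutable flags = (args.length > 1 && args[1] == "--release") ? "-O" : "-g";

    JSONValue[] rules;
    foreach (src; ["foo.d", "bar.d"])
    {
        immutable obj = src ~ ".o";
        rules ~= JSONValue([
            "inputs":  JSONValue([JSONValue(src)]),
            "task":    JSONValue([JSONValue(["dmd", flags, "-c", src, "-of" ~ obj])]),
            "outputs": JSONValue([JSONValue(obj)]),
        ]);
    }

    // Write the description for the build system to consume.
    File("button.json", "w").writeln(JSONValue(rules).toPrettyString);
}
```

Run as `rdmd build.d --release` (or without the flag) to regenerate the description whenever the configuration changes.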
June 17, 2016
On Thursday, 16 June 2016 at 13:39:20 UTC, Atila Neves wrote:
> It would be a worthwhile trade-off, if those were the only two options available, but they're not. There are multiple build systems out there that do correct builds whilst being faster than make. Being faster is easy, because make is incredibly slow.
>
> I didn't even find out about ninja because I read about it in a blog post, I actively searched for a make alternative because I was tired of waiting for it.

Make is certainly not slow for full builds. That is what I was testing.

I'm well aware of Ninja and it is maybe only 1% faster than Make for full builds[1]. There is only so much optimization that can be done when spawning processes as dictated by a DAG. 99% of the CPU's time is spent on running the tasks themselves.

Where Make gets slow is when checking for changes on a ton of files. I haven't tested it, but I'm sure Button is faster than Make in this case because it checks for changed files using multiple threads. Using the file system watcher can also bring this down to a near-zero time.
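The multithreaded change check is easy to sketch in D with `std.parallelism`; comparing modification times below merely stands in for whatever state Button actually records per file, and `findChanged` is a made-up name:

```d
// Sketch: detect changed files by stat'ing them across a thread pool.
// Timestamp comparison stands in for the build system's real change state.
import std.datetime.systime : SysTime;
import std.file : timeLastModified;
import std.parallelism : parallel;

string[] findChanged(string[] paths, SysTime[string] lastSeen)
{
    auto changed = new bool[paths.length];

    // parallel() distributes iterations over a thread pool, so the
    // stat() calls overlap instead of running strictly one by one.
    // Each iteration writes only its own slot, so this is race-free.
    foreach (i, path; parallel(paths))
    {
        auto prev = path in lastSeen;
        changed[i] = prev is null || timeLastModified(path) != *prev;
    }

    string[] result;
    foreach (i, path; paths)
        if (changed[i])
            result ~= path;
    return result;
}
```

For very large trees, a file system watcher (inotify and friends) replaces the scan entirely by recording changes as they happen.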

Speed is not the only virtue of a build system. A build system can be amazeballs fast, but if you can't rely on it doing incremental builds correctly in production, then you're probably doing full builds every single time. Being easy to use and robust is also pretty important.

[1] http://hamelot.io/programming/make-vs-ninja-performance-comparison/
June 16, 2016
On Fri, Jun 17, 2016 at 05:41:30AM +0000, Jason White via Digitalmars-d-announce wrote: [...]
> Where Make gets slow is when checking for changes on a ton of files. I haven't tested it, but I'm sure Button is faster than Make in this case because it checks for changed files using multiple threads. Using the file system watcher can also bring this down to a near-zero time.

IMO using the file system watcher is the way to go. It's the only way to beat the O(n) pause at the beginning of a build as the build system scans for what has changed.


> Speed is not the only virtue of a build system. A build system can be amazeballs fast, but if you can't rely on it doing incremental builds correctly in production, then you're probably doing full builds every single time. Being easy to use and robust is also pretty important.
[...]

For me, correctness is far more important than speed. Mostly because at my day job, we have a Make-based build system and because of Make's weaknesses, countless hours, sometimes even days, have been wasted running `make clean; make` just so we can "be sure".  Actually, it's worse than that; the "official" way to build it is:

	svn diff > /tmp/diff
	\rm -rf old_checkout
	mkdir new_checkout
	cd new_checkout
	svn co http://svnserver/path/to/project
	patch -p0 </tmp/diff
	make

because we have been bitten before by `make clean` not *really* cleaning *everything*, and so `make clean; make` was actually producing a corrupt image, whereas checking out a fresh new workspace produces the correct image.

Far too much time has been wasted "debugging" bugs that weren't really there, just because Make cannot be trusted to produce the correct results. Or heisenbugs that disappear when you rebuild from scratch. Unfortunately, due to the size of our system, a fresh svn checkout on a busy day means 15-20 mins (due to everybody on the local network trying to do fresh checkouts!), then make takes about 30-45 mins to build everything.  When your changeset touches Makefiles, this could mean a 1 hour turnaround for every edit-compile-test cycle, which is ridiculously unproductive.

Such unworkable turnaround times, of course, cause people to be lazy and just run tests on incremental builds (of unknown correctness), which results in people checking in changesets that are actually wrong but just happen to work when tested against an incremental build (thanks to Make picking up stray old copies of obsolete libraries, object files, and other such detritus). Which means *everybody*'s workspace breaks after running `svn update`. And of course, nobody is sure whether it broke because of their own changes or because somebody checked in a bad changeset, so it's `make clean; make` time just to "be sure". That's n times however many man-hours (for n = number of people on the team) straight down the drain, where had the build system actually been reliable, only the person responsible would have had to spend a few extra hours to fix the problem.

Make proponents don't seem to realize how a seemingly unimportant feature such as build correctness actually adds up to a huge cost in employee productivity, i.e., wasted hours, AKA wasted employee wages for the time spent watching `make clean; make` run.


T

-- 
It is of the new things that men tire --- of fashions and proposals and improvements and change. It is the old things that startle and intoxicate. It is the old things that are young. -- G.K. Chesterton
June 17, 2016
On Friday, 17 June 2016 at 06:18:28 UTC, H. S. Teoh wrote:
> For me, correctness is far more important than speed. Mostly because at my day job, we have a Make-based build system and because of Make's weaknesses, countless hours, sometimes even days, have been wasted running `make clean; make` just so we can "be sure".  Actually, it's worse than that; the "official" way to build it is:
>
> 	svn diff > /tmp/diff
> 	\rm -rf old_checkout
> 	mkdir new_checkout
> 	cd new_checkout
> 	svn co http://svnserver/path/to/project
> 	patch -p0 </tmp/diff
> 	make
>
> because we have been bitten before by `make clean` not *really* cleaning *everything*, and so `make clean; make` was actually producing a corrupt image, whereas checking out a fresh new workspace produces the correct image.
>
> Far too much time has been wasted "debugging" bugs that weren't really there, just because Make cannot be trusted to produce the correct results. Or heisenbugs that disappear when you rebuild from scratch. Unfortunately, due to the size of our system, a fresh svn checkout on a busy day means 15-20 mins (due to everybody on the local network trying to do fresh checkouts!), then make takes about 30-45 mins to build everything.  When your changeset touches Makefiles, this could mean a 1 hour turnaround for every edit-compile-test cycle, which is ridiculously unproductive.
>
> Such unworkable turnaround times, of course, cause people to be lazy and just run tests on incremental builds (of unknown correctness), which results in people checking in changesets that are actually wrong but just happen to work when tested against an incremental build (thanks to Make picking up stray old copies of obsolete libraries, object files, and other such detritus). Which means *everybody*'s workspace breaks after running `svn update`. And of course, nobody is sure whether it broke because of their own changes or because somebody checked in a bad changeset, so it's `make clean; make` time just to "be sure". That's n times however many man-hours (for n = number of people on the team) straight down the drain, where had the build system actually been reliable, only the person responsible would have had to spend a few extra hours to fix the problem.
>
> Make proponents don't seem to realize how a seemingly unimportant feature such as build correctness actually adds up to a huge cost in employee productivity, i.e., wasted hours, AKA wasted employee wages for the time spent watching `make clean; make` run.

I couldn't agree more! Correctness is by far the most important feature of a build system. Second to that is probably being able to make sense of what is happening.

I have the same problems as you in my day job, but magnified. Some builds take 3+ hours, some nearly 24 hours, and none of the developers can run full builds themselves because the build process is so long and complicated. Turn-around time to test changes is abysmal and everyone is probably orders of magnitude more unproductive because of it. All of this because we can't trust Make or Visual Studio to do incremental builds correctly.

I hope to change that with Button.
June 17, 2016
On Friday, 17 June 2016 at 05:41:30 UTC, Jason White wrote:
> On Thursday, 16 June 2016 at 13:39:20 UTC, Atila Neves wrote:
>> It would be a worthwhile trade-off, if those were the only two options available, but they're not. There are multiple build systems out there that do correct builds whilst being faster than make. Being faster is easy, because make is incredibly slow.
>>
>> I didn't even find out about ninja from a blog post; I actively searched for a make alternative because I was tired of waiting for make.
>
> Make is certainly not slow for full builds. That is what I was testing.

I only care about incremental builds. I actually have difficulty understanding why you tested full builds; they're utterly uninteresting to me.

> A build system  can be amazeballs fast, but if you can't rely on it doing incremental builds correctly in production, then you're probably doing full builds every single time. Being easy to use and robust is also pretty important.

I agree, but CMake/ninja, tup, reggae/ninja, and reggae/binary are all correct _and_ fast.

Atila
June 17, 2016
On Friday, 17 June 2016 at 06:18:28 UTC, H. S. Teoh wrote:
> On Fri, Jun 17, 2016 at 05:41:30AM +0000, Jason White via Digitalmars-d-announce wrote: [...]
>> Where Make gets slow is when checking for changes on a ton of files. I haven't tested it, but I'm sure Button is faster than Make in this case because it checks for changed files using multiple threads. Using the file system watcher can also bring this down to a near-zero time.
>
> IMO using the file system watcher is the way to go. It's the only way to beat the O(n) pause at the beginning of a build as the build system scans for what has changed.

See, I used to think that; then I measured. tup uses fuse for this, and that's exactly why it's fast. I was considering a similar approach with the reggae binary backend, so I went and timed make, tup, ninja, and the binary backend itself on a synthetic project. Basically, I wrote a program to write out source files to be compiled, with a runtime parameter indicating how many source files to write.

The most extensive tests I did were on a synthetic project of 30k source files. That's a lot bigger than what the vast majority of developers are ever likely to work on. As a comparison, the 2.6.11 version of the Linux kernel had 17k files.
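The generator is trivial; something along these lines (file and function names are made up), here emitting C sources plus the simplest possible Makefile so that no-op build times can be compared across tools:

```d
// gen.d -- write N trivial C source files plus a minimal Makefile,
// to benchmark no-op incremental builds at different project sizes.
import std.conv : to;
import std.file : mkdirRecurse, write;
import std.format : format;
import std.stdio : File;

void main(string[] args)
{
    // Runtime parameter: how many source files to generate.
    immutable n = args.length > 1 ? args[1].to!int : 1000;
    mkdirRecurse("src");

    // Simplest possible Makefile: one pattern rule, no recursion,
    // no generated dependencies.
    auto mk = File("Makefile", "w");
    mk.writeln("all: $(patsubst %.c,%.o,$(wildcard src/*.c))");
    mk.writeln("%.o: %.c\n\tcc -c $< -o $@");

    foreach (i; 0 .. n)
        write(format("src/file%d.c", i),
              format("int func%d(void) { return %d; }\n", i, i));
}
```

Run `rdmd gen.d 30000`, build once with each tool, then time the second (no-op) build.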

A no-op build on my laptop was about (from memory):

tup: <1s
ninja, binary: 1.3s
make: >20s

It turns out that just stat'ing everything is fast enough for pretty much everybody, so I just kept the simple algorithm. Bear in mind the Makefiles here were the simplest possible - doing anything that usually goes on in Makefileland would have made them far, far slower. I know: I converted a build system at work from make to hand-written ninja, and its no-op builds went from nearly 2 minutes to 1s.

If you happen to be unlucky enough to work on a project so large you need to watch the file system, then use the tup backend I guess.

Atila

June 17, 2016
On Friday, 17 June 2016 at 04:54:37 UTC, Jason White wrote:
>> Why the build script can't have a command line interface?
>
> It could, but then the build script becomes more complicated, and for little gain.

It's only as complicated as the features it needs to implement, no more. If a command line interface is not needed, it can be omitted. For example:
---
import button;
auto Build = ...
mixin mainBuild!Build; //no CLI
---

> Adding command line options on top of that to configure the build would be painful.

$ rdmd build.d configure [options]

Well, if one wants to go really complex, a prebuilt binary can be provided to help with that, but I don't think it's always needed.

> It would be simpler and cleaner to write a D program to generate the JSON build description for Button to consume. Then you can add a command line interface to configure how the build description is generated. This is how the Lua build descriptions work[1].

---
import button;
auto Build = ...
mixin mainBuildJSON!Build;
---
It should be possible to make this work like the Lua script.