February 18, 2016
On Thursday, 18 February 2016 at 16:47:16 UTC, Márcio Martins wrote:
> On Thursday, 18 February 2016 at 15:36:42 UTC, Jonathan M Davis wrote:
>> On Thursday, 18 February 2016 at 14:23:12 UTC, Márcio Martins wrote:
>>> I agree, but I don't see why this would have to change. It shouldn't change. Frontend development could happen on DMD as the *reference* compiler.
>>
>> And what exactly is the difference between the "official" compiler and the "reference" compiler supposed to be?
>>
>> - Jonathan M Davis
>
> "official" carries a connotation of endorsement doesn't it? In other words, if you are given a choice of 3 and you know very little about each, which would you predict would give you a better user experience?
>
> Reference in this case is the one that most closely follows the bleeding edge of the language spec, which new users don't benefit much from. In this case it's also where all the frontend development would happen. But what we call it doesn't really matter to end users.
>
> What I have been defending thus far is that we could entertain the possibility that end users would be better off if we "suggested" they try out one of the other compilers before they try DMD. The easiest way to suggest that is to stamp "official" on one of the stronger alternatives. Once installers for LDC and GDC are on par with DMD's, is there still a pragmatic reason to suggest DMD to new users, given that all DMD has going for it from a new user's perspective is compilation speed?
>
> For everyone else nothing would change; we'd go about our daily lives, using our favorite compiler as always. But meanwhile, people exploring and looking to try D could try out its amazing features and get first-hand proof that these awesome features come at no efficiency cost, as advertised.

Honestly, I think that dmd _should_ be the go-to compiler. It's the fast one. It's the most up-to-date - especially right now, as gdc and ldc have been trying to get to the point where they're using the new D frontend instead of the C++ one. gdc and ldc are great if you want to make sure that your code is faster in production, but they're slower for actually getting the code written, and AFAIK, if you want to be writing scripts in D (which is really useful), you need rdmd, which means using dmd (and I sure wouldn't want those scripts to be compiled with gdc or ldc anyway - compilation speed matters way more in that case than it even does during development).
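
For anyone who hasn't tried it, a D "script" is just a .d file with an rdmd shebang - a minimal sketch (it assumes rdmd is on your PATH and only lists the current directory):

#!/usr/bin/env rdmd
// Minimal sketch: rdmd compiles this on the fly and caches the binary,
// so repeated runs start almost instantly.
import std.file : SpanMode, dirEntries;
import std.stdio : writeln;

void main()
{
    // List the files in the current directory, without recursing.
    foreach (entry; dirEntries(".", SpanMode.shallow))
        writeln(entry.name);
}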

New users are frequently impressed by how fast dmd compiles code, and it's a big selling point for us. It's only later that benchmarking comes into play, and if you want to do that, then use gdc or ldc. The download page already says to use gdc or ldc if you want better optimization.

I wouldn't want to use gdc or ldc for normal development unless I had to, and I wouldn't want to encourage others to either. dmd's speed is worth way too much when it comes to getting actual work done. And it's not like it generates slow code. It just doesn't generate code that's as fast as gdc or ldc generates, and when you get to the point that you need the fast binary, then use gdc or ldc. But use them as the default? Why? dmd is a clear winner as far as development goes. It's both faster and more up-to-date. It's just that gdc or ldc would be better for production code if you really need all the speed that you can get.

We need to work towards getting and keeping gdc and ldc in sync with dmd so that they stop being behind like they typically are, and we do need to make sure that it's clear that gdc and ldc generate faster binaries. But I think that it would be a huge mistake to push either of them as the one that everyone should be using by default.

- Jonathan M Davis
February 18, 2016
On Thursday, 18 February 2016 at 17:52:10 UTC, Kai Nacke wrote:
> I really like the compiler diversity. What I miss (hint!) is a program to verify the compiler/backend correctness. Just generate a random D program, compile with all 3 compilers and compare the output. IMHO we could find a lot of backend bugs this way. This would help all D compilers.

That would really be cool.

- Jonathan M Davis
February 18, 2016
On Thursday, 18 February 2016 at 17:56:32 UTC, Jonathan M Davis wrote:
>
> Honestly, I think that dmd _should_ be the goto compiler. [snip]

I agree with your response.

That being said, it can't hurt to make things a bit clearer for new users. If you go to the download page, there is a "more information" button that takes you to the wiki. But the wiki page just looks like raw HTML; it doesn't look nearly as good as the page you're coming from.

Assuming that is fixed, I would recommend two other small changes:
1) Put the first line from the wiki, where it says "If you're a beginner DMD is the recommended choice, ...", at the top of the compiler page,
2) Replace the GDC and LDC "Strong optimization" lines on the download page with something a little clearer. Even "Stronger optimization than DMD" would be clearer.
February 18, 2016
On Thursday, 18 February 2016 at 15:58:14 UTC, Nick Sabalausky wrote:
>  but for the second group, a language that's fast like C/C++ but not nearly as unproductive IS appealing, and even seems to be something they're often on the lookout for.

I would agree with you if you could write D code using the most convenient style possible, compile using LDC, and get the same speed as C or C++. My experience suggests that is not going to happen. You're going to need some knowledge of the language to get the best performance.
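
For instance - a made-up illustration, not a benchmark, and the names are mine - the "convenient" way to build up a string and the performance-aware way look quite different, even though both are plain D:

import std.array : appender;

// The convenient version leans on GC-backed array append; the second one
// batches growth with an appender and allocates far less.
string naiveJoin(string[] parts)
{
    string result;
    foreach (p; parts)
        result ~= p;   // may reallocate / touch the GC as the string grows
    return result;
}

string fasterJoin(string[] parts)
{
    auto buf = appender!string();
    foreach (p; parts)
        buf.put(p);    // amortized growth, far fewer allocations
    return buf.data;
}

Same result, same language, but knowing which one to reach for (and why) is exactly the kind of knowledge I mean.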
February 18, 2016
On Thursday, 18 February 2016 at 17:52:10 UTC, Kai Nacke wrote:
> I really like the compiler diversity. What I miss (hint!) is a program to verify the compiler/backend correctness. Just generate a random D program, compile with all 3 compilers and compare the output. IMHO we could find a lot of backend bugs this way. This would help all D compilers.
>
> Regards,
> Kai

Reminds me of Csmith:
https://embed.cs.utah.edu/csmith/

I believe Brian Schott had worked on something like this for D... Did that ever go anywhere?
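
For anyone curious, the comparison half of such a harness is the easy part - a rough sketch below (the compiler output flags are my guesses at each CLI, and the random program generator itself, the hard part, is not shown):

#!/usr/bin/env rdmd
// Compile one test case with all three compilers, run the binaries,
// and compare their output. Any divergence points at a bug in one of
// the compilers (or at unspecified behaviour in the generated program).
import std.process : execute;
import std.stdio : writefln, writeln;

int main(string[] args)
{
    if (args.length < 2)
    {
        writeln("usage: difftest <testcase.d>");
        return 1;
    }
    immutable src = args[1];

    // compiler name -> flags naming the produced executable (assumed CLIs)
    string[][string] builds = [
        "dmd"  : ["-ofcase_dmd"],
        "ldc2" : ["-of=case_ldc2"],
        "gdc"  : ["-o", "case_gdc"],
    ];

    string[string] runOutput;
    foreach (cc, flags; builds)
    {
        auto build = execute([cc, src] ~ flags);
        if (build.status != 0)
        {
            writefln("%s failed to build the case:\n%s", cc, build.output);
            continue;
        }
        runOutput[cc] = execute(["./case_" ~ cc]).output;
    }

    bool haveRef = false;
    bool mismatch = false;
    string reference;
    foreach (cc, outp; runOutput)
    {
        if (!haveRef)
        {
            reference = outp;
            haveRef = true;
        }
        else if (outp != reference)
        {
            writefln("MISMATCH involving %s", cc);
            mismatch = true;
        }
    }
    return mismatch ? 1 : 0;
}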
February 18, 2016
On Thursday, 18 February 2016 at 17:52:10 UTC, Kai Nacke wrote:
> On Thursday, 18 February 2016 at 12:16:49 UTC, Radu wrote:
>> As a casual user of the language I see that there is a fragmentation of resources and a waste in this regard with people developing in mainline, then some of you LDC guys catching up.
>
> As Iain already pointed out the main problem is (undocumented or weird) AST changes. This makes a merge sometimes painful. This can (and will) go better.
>
> This is IMHO the only "waste". Nobody of the LDC team does frontend development, we are all focused on the glue layer.
>
>> My simple assumption is that if presumably the dmd backend is not maintained anymore, a lot of the core dmd people can focus on improving whatever problems the frontend or glue layers have.
>
> As far as I know only Walter (and Daniel I think) work on the backend. This is not "a lot of the core dmd people".
>
>> This could only mean that you core LDC guys could focus on llvm backend optimizations (both code gen and performance related). I'm going to assume that those kind of performance optimizations are also constantly done by upstream llvm, so more win here.
>
> By chance I am an LLVM committer, too. But the LDC team only focuses on getting the glue library and the runtime library right. Adding new useful optimizations is hard work. The people working on it are either researchers or backed by a big company.
>
>> Users will not magically turn into contributors if their perception is that there is always going to be a catch-up game to play somewhere. Not to mention that if one wants to get something into LDC, one has to commit it in mainline, which is DMD; you just multiplied the know-how someone needs to have to do some useful work...
>
> It depends on the feature you want. If you want a new language feature then yes. But then you do not change LDC, you change the language specification and therefore the reference compiler.
>
> You can add a lot of features without ever touching DMD frontend code. The sanitizers, for example. Or the not-yet-merged PR for profile-guided optimizations.
>
>> And finally, just pointing people to ldc/gdc (always a version or 2 behind, another grief) each time dmd performance is poor, looks awfully wrong.
>
> I really find this "speed" argument doubtful. My experience is that if you really need performance you must *know* what you are doing. Just picking some code from a web site, compiling it and then complaining that the resulting binary is slower than that of language xy is not a serious approach.
>
> For a novice user, LDC can be discouraging: just type ldc2 -help-hidden. But you may need to know about these options to e.g. enable the right auto-vectorizer for your problem.
>
> I once wrote an MD5 implementation in pure Java which was substantially faster than the reference implementation in C from RFC 1321 (gcc -O3 compiled). C is not faster than Java if you know Java but not C. The same is true for D.
>
> I really like the compiler diversity. What I miss (hint!) is a program to verify the compiler/backend correctness. Just generate a random D program, compile with all 3 compilers and compare the output. IMHO we could find a lot of backend bugs this way. This would help all D compilers.
>
> Regards,
> Kai

I think there are more people involved in DMD in general; you need to count reviewers and all the infrastructure deployed. But even if only two of them are involved, having them 100% focused on core D stuff would be a boon.

I see a trend with this discussion:

1. Compiler speed. This is a clear win for DMD, but at the same time LDC hasn't benefited from consistent investment in performance tuning. This is obviously just speculation, but I think the performance gap can be substantially closed with more resources invested here, at least for unoptimized builds.

2. Speed of compiled code. People often suggest that DMD could close the gap here, but I see this as wishful thinking; just listing all the optimizations LLVM does is depressing for anyone wanting to move DMD closer to that. It's just game over in this regard. Plus, who is going to work on them except Walter? Does anyone want to invest in a dubiously licensed backend?

But the story is more complicated :)

We are talking here about perception: LLVM is a well-known and respected backend, and this is a win for people using or wanting to contribute to the language.

Also, people forget that DMD is limited in the number of architectures it supports.

My hope is that LDC will be on the same announcement page when a new DMD version is launched. When that happens, common sense will just kill DMD.

Appreciate all the hard work you guys do!
February 18, 2016
On Thursday, 18 February 2016 at 12:23:18 UTC, Jonathan M Davis wrote:
> if the dmd backend were dropped, […] that would further slow down the development of the frontend and not necessarily improve things overall.

How would that be?

 — David
February 18, 2016
On Thursday, 18 February 2016 at 11:41:26 UTC, Kai Nacke wrote:
> LLVM has about 2.5 million lines of code. I am anything but sure that it is easy to improve compilation speed.

I think you are a tad too pessimistic here. First, don't forget that there are some big LLVM customers for which low compile times are important too (remember all the buzz from when Clang first hit the scene?).

Second, when was the last time you focussed on optimizing LDC -O0 compiler performance? There is currently a lot of low-hanging fruit, and then there are still the more involved options (such as making sure we use FastISel as much as possible).

It might not end up quite as fast as DMD is right now. But imagine that Walter had invested all the time he spent on, e.g., implementing DWARF EH into optimizing the LDC frontend/glue layer/backend pass structure instead. Who knows, we might have an LDC-based compiler today that is faster than the DMD we currently have.

 — David
February 18, 2016
On Wednesday, 17 February 2016 at 22:57:20 UTC, Márcio Martins wrote:
> […]

On a completely unrelated note, you aren't by any chance the Márcio Martins who is giving a talk at ETH in a couple of days, are you?

 — David
February 18, 2016
On Thursday, 18 February 2016 at 15:36:42 UTC, Jonathan M Davis wrote:
> On Thursday, 18 February 2016 at 14:23:12 UTC, Márcio Martins wrote:
>> I agree, but I don't see why this would have to change. It shouldn't change. Frontend development could happen on DMD as the *reference* compiler.
>
> And what exactly is the difference between the "official" compiler and the "reference" compiler supposed to be?

A reference implementation is written to the spec in the simplest and clearest possible way so that it is bug-free... It is not for production...