July 21, 2020
On Tuesday, 21 July 2020 at 11:37:16 UTC, Atila Neves wrote:
> On Monday, 20 July 2020 at 12:58:39 UTC, Petar Kirov [ZombineDev] wrote:
>
>> [...]
>
> That is my dream for D. If the compiler *is* the build system, then sure, parallelise the compiler. Currently, I don't see the point of even trying.

Is it known how much parallelism is practically available now? Would an SDC-like architecture cleanly enable significantly more?

July 21, 2020
On Tuesday, 21 July 2020 at 15:22:44 UTC, Bruce Carneal wrote:
> On Tuesday, 21 July 2020 at 11:37:16 UTC, Atila Neves wrote:
>> On Monday, 20 July 2020 at 12:58:39 UTC, Petar Kirov [ZombineDev] wrote:
>>
>>> [...]
>>
>> That is my dream for D. If the compiler *is* the build system, then sure, parallelise the compiler. Currently, I don't see the point of even trying.
>
> Is it known how much parallelism is practically available now? Would an SDC-like architecture cleanly enable significantly more?

It's a good question, and the answer is yes.

Not just in a performance sense but also in a maintainability sense, because such an architecture would make dependency issues easier to see.
July 22, 2020
On Tuesday, 21 July 2020 at 13:29:55 UTC, Petar Kirov [ZombineDev] wrote:
> On Tuesday, 21 July 2020 at 11:37:16 UTC, Atila Neves wrote:
>> [...]
>
> In one of the web technologies we use at work, the compiler is used as a library by the build system to build a dependency graph (based on the imports) of all code and non-code assets. Then there is a declarative way to describe the transformations (compilation, minification, media encoding, etc.) that need to be applied to each part of the project. The linking step (as in C/C++) is implicit: it's as if you invoke the linker, which works backwards to figure out that, in order to link dependencies in the form of libraries A and B, it first needs to compile them with compilers X and Y.

That sounds amazing.
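The scheme Petar describes (a compiler front end used as a library to derive a dependency graph from imports, declarative per-kind transformations, and a "linker" that works backwards from the target) can be sketched roughly as below. All names here, including `scan_imports` and the `sources`/`transforms` tables, are hypothetical illustrations rather than any real tool's API:

```python
# A minimal sketch of a build system driven by a dependency graph.
# The compiler front end is stubbed out as a table lookup; in a real
# system it would parse each module to discover its imports.

from graphlib import TopologicalSorter  # stdlib, Python 3.9+

def scan_imports(module, sources):
    # Stand-in for "compiler as a library": report a module's imports.
    return sources[module]["imports"]

def build(target, sources, transforms):
    # Build the dependency graph of everything reachable from `target`.
    graph, pending = {}, [target]
    while pending:
        mod = pending.pop()
        if mod in graph:
            continue
        graph[mod] = set(scan_imports(mod, sources))
        pending.extend(graph[mod])

    # "Linking in reverse": topological order guarantees each dependency
    # is transformed (compiled, encoded, ...) before its dependents.
    order = list(TopologicalSorter(graph).static_order())
    artifacts = {}
    for mod in order:
        transform = transforms[sources[mod]["kind"]]
        artifacts[mod] = transform(mod)
    return order, artifacts

sources = {
    "app":  {"kind": "code",  "imports": ["libA", "libB"]},
    "libA": {"kind": "code",  "imports": ["logo"]},
    "libB": {"kind": "code",  "imports": []},
    "logo": {"kind": "media", "imports": []},
}
# Declarative transformation table: one rule per asset kind.
transforms = {
    "code":  lambda m: f"compiled({m})",
    "media": lambda m: f"encoded({m})",
}

order, artifacts = build("app", sources, transforms)
print(order)             # dependencies appear before their dependents
print(artifacts["app"])
```

Independent subtrees of the graph (here `libB` and the `logo`/`libA` chain) are exactly the places where such a build could fan out in parallel, which is what makes the graph-first design attractive.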