August 10, 2022
On Wed, Aug 10, 2022 at 09:33:27PM +0000, monkyyy via Digitalmars-d wrote: [...]
> If I look at some of the old std code, the only explanation for why some of it is so terrible is that it was written before there were good D programmers
[...]

Don't forget also that D has changed a lot since its beginning. The language we have today is very different from the one in which some of this code was originally written: many of D's most powerful features did not exist yet, so what counted as good style then is very different from what counts as good style today. A lot of things possible today were simply not possible back then, so the old code was written under different constraints and could not take advantage of the advances the language has made since.

For example, the earliest versions of D did not have templates, static if, CTFE, or compile-time introspection, which today are a large part of what defines D. Can you imagine what kind of code you had to write back then, compared to what we can write today?
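To make that concrete, here's a small sketch of the kind of thing those features enable today (the function is invented for illustration): a template that inspects its argument with static if and picks an O(1) path when the type provides one, and that the compiler can evaluate via CTFE:

    // A toy template: it inspects its argument at compile time with
    // static if and picks an O(1) path when the type provides one.
    size_t countItems(R)(R r)
    {
        static if (is(typeof(r.length) : size_t))
        {
            return r.length;    // the type knows its own length: O(1)
        }
        else
        {
            size_t n;           // fall back to walking it: O(n)
            foreach (e; r) ++n;
            return n;
        }
    }

    // CTFE: the compiler evaluates the whole call at compile time.
    static assert(countItems([1, 2, 3]) == 3);

None of that was expressible in early D.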


T

-- 
Winners never quit, quitters never win. But those who never quit AND never win are idiots.
August 10, 2022
On Wednesday, 10 August 2022 at 21:51:21 UTC, H. S. Teoh wrote:
> For example, the earliest versions of D did not have templates, static if, CTFE, or compile-time introspection, which today are a large part of what defines D. Can you imagine what kind of code you had to write back then, compared to what we can write today?

Considering I'm almost entirely here for templates, the answer is simple: I wouldn't be here.


August 10, 2022
On Wed, Aug 10, 2022 at 09:55:36PM +0000, monkyyy via Digitalmars-d wrote:
> On Wednesday, 10 August 2022 at 21:51:21 UTC, H. S. Teoh wrote:
> > For example, the earliest versions of D did not have templates, static if, CTFE, or compile-time introspection, which today are a large part of what defines D. Can you imagine what kind of code you had to write back then, compared to what we can write today?
> 
> Considering I'm almost entirely here for templates, the answer is simple: I wouldn't be here.

Haha, me too. I heavily use templates, CTFE, static if, and compile-time introspection (DbI, design by introspection, rocks!). Without these, I might as well crawl back to C++. Or swallow the bitter pill and go back to plain old C (as I have to at work). I was about to say Java, but that pill would be too bitter to take even in the face of C++'s flaws and C's lack of safety, so no. :-P
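For a taste of what the introspection side buys you, here's a toy sketch (names invented); the struct's fields are enumerated entirely at compile time, with no runtime reflection anywhere:

    import std.stdio : writefln;

    // Print every field of any struct: the foreach over t.tupleof is
    // unrolled at compile time, and the field names come from
    // __traits(identifier, ...).
    void dumpFields(T)(T t)
    {
        foreach (i, member; t.tupleof)
            writefln("%s = %s", __traits(identifier, T.tupleof[i]), member);
    }

    struct Point { int x = 3; int y = 4; }

    void main()
    {
        dumpFields(Point());   // prints "x = 3" then "y = 4"
    }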


T

-- 
VI = Visual Irritation
August 11, 2022
On Monday, 8 August 2022 at 16:59:20 UTC, ryuukk_ wrote:
> On Monday, 8 August 2022 at 15:05:49 UTC, wjoe wrote:
> [...]
>
>> On Sunday, 7 August 2022 at 21:25:57 UTC, ryuukk_ wrote:
>>
>> I'm not on the anti-GC train. I use it myself in some of my projects; I find it very useful to have
>>

I don't understand where in my post I implied that you are on the anti-GC train.
You brought up the Discord thing to make the point that GCs have bottlenecks and allocators are the solution?
However, their solution to a GC bottleneck was not to fix it with allocators but to dump their Go code in favor of a Rust port.
Nor can I see how Go's GC bottleneck relates to D's, or how allocators would fix it.
Go's GC would still run a collection every 2 minutes.
So you would either need GC.disable to stop it from collecting where you can't afford the impact, paired with GC.collect to run one when it suits you, or you would need to disable the GC completely.
Or you port your entire code base to a non-GC language.
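In D, the disable/collect pattern looks roughly like this (a sketch; frame() is a made-up stand-in for whatever latency-sensitive work you have):

    import core.memory : GC;

    void frame() { /* latency-sensitive work goes here */ }

    void main()
    {
        // Stop automatic collections (the GC may still collect if it
        // is truly out of memory).
        GC.disable();
        foreach (i; 0 .. 1000)
            frame();
        GC.enable();
        GC.collect();   // collect now, when it suits us
    }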

>> The point I am trying to make is that D has the capability to provide a solution both to GC users and to people whose performance constraints prohibit the use of a GC
>>
>> But for some reason, people in the community only focus on the GC and disregard everything else, preventing me from properly advertising D as a pragmatic solution

Maybe that's because the GC has no downside for them. But that's my own guess.
If you absolutely can't tolerate a GC, there's -betterC, in case you didn't know.
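A minimal sketch of what that looks like (compile with dmd -betterC; no druntime, no GC, just the C runtime):

    // app.d -- build with: dmd -betterC app.d
    import core.stdc.stdio : printf;

    extern (C) void main()
    {
        // No druntime here: no GC, no exceptions, no module constructors.
        printf("hello from betterC\n");
    }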

I guess it's an ignorant point of view, but I don't see how someone writing high-performance, real-time, or embedded/micro-architecture code under constraints that prohibit the use of a GC would find D-Runtime/Phobos meeting their requirements.
There's at least one lightweight implementation of D-Runtime.
And Phobos? Everything that allocates or is templated will likely be too slow or bloated in that case, so a specialized solution that takes advantage of context seems necessary.

August 12, 2022
On 12/08/2022 1:20 AM, wjoe wrote:
> I guess it's an ignorant point of view, but I don't see how someone writing high-performance, real-time, or embedded/micro-architecture code under constraints that prohibit the use of a GC would find D-Runtime/Phobos meeting their requirements.

There have been multiple D users in this category who have successfully built production systems with the GC linked in.

All memory allocations are expensive and can fail; if you want performant, safe code, you have no choice but to avoid allocating memory.

It does not matter what language you use: if you allocate, you slow your code down.
August 11, 2022
On Monday, 8 August 2022 at 17:25:11 UTC, Paul Backus wrote:
> On Monday, 8 August 2022 at 15:51:11 UTC, wjoe wrote:
>>
>> Yes, but more typing and it requires an import.
>> No intention to complain; just saying convenience and such. :)
>
> These days, new attributes are added to the `core.attribute` module rather than being available globally, so if the `@GC(...)` syntax were added, it would also require an import. :)

@GC(...) was not supposed to be an attribute; it's more akin to a pragma, but for the GC.
It reads literally: "At GC: do not disturb", "At GC: collect now" (or, for those who'd prefer a polite rather than a commanding tone: "At GC: please don't disturb", "At GC: would you kindly collect the garbage now").

There are hints in the language and libraries that indicate a desire among programmers for a programming language to read like a natural, spoken language. That's where the @ (at) came from. :)

August 11, 2022
On Thursday, 11 August 2022 at 13:26:21 UTC, rikki cattermole wrote:
>
> On 12/08/2022 1:20 AM, wjoe wrote:
>> I guess it's an ignorant point of view, but I don't see how someone writing high-performance, real-time, or embedded/micro-architecture code under constraints that prohibit the use of a GC would find D-Runtime/Phobos meeting their requirements.
>
> There have been multiple D users in this category who have successfully built production systems with the GC linked in.

I was referring to the "someone whose constraints prohibit the use of a GC" part, and my reasoning is that someone with such a use case probably wouldn't find solutions in *Phobos* that meet their demands, because the result would be too bloated or slow. As such, they would probably tailor their own optimized solutions.
Nowhere did I claim that it's impossible to write production systems in D with a GC linked in.

> All memory allocations are expensive and can fail; if you want performant, safe code, you have no choice but to avoid allocating memory.
>
> It does not matter what language you use: if you allocate, you slow your code down.

Thank you. That's what I'm saying - or trying to say, at least.
August 12, 2022
On 12/08/2022 1:54 AM, wjoe wrote:
>> All memory allocations are expensive and can fail; if you want performant, safe code, you have no choice but to avoid allocating memory.
>>
>> It does not matter what language you use: if you allocate, you slow your code down.
> 
> Thank you. That's what I'm saying - or trying to say, at least.

Pretty much all programs have to allocate at some point during their lifecycle.

If you understand this, having the GC run during the phases where performance doesn't matter is fine, and that is what people who use D in this sort of environment do.
August 11, 2022
On Thursday, 11 August 2022 at 13:26:21 UTC, rikki cattermole wrote:
>
> On 12/08/2022 1:20 AM, wjoe wrote:
>> I guess it's an ignorant point of view, but I don't see how someone writing high-performance, real-time, or embedded/micro-architecture code under constraints that prohibit the use of a GC would find D-Runtime/Phobos meeting their requirements.
>
> There have been multiple D users in this category who have successfully built production systems with the GC linked in.
>
> All memory allocations are expensive and can fail; if you want performant, safe code, you have no choice but to avoid allocating memory.
>
> It does not matter what language you use: if you allocate, you slow your code down.

Yes. The important question is: does it matter?
August 11, 2022
On Thursday, 11 August 2022 at 13:20:13 UTC, wjoe wrote:
>
> I guess it's an ignorant point of view, but I don't see how someone writing high-performance, real-time, or embedded/micro-architecture code under constraints that prohibit the use of a GC would find D-Runtime/Phobos meeting their requirements.
> There's at least one lightweight implementation of D-Runtime.
> And Phobos? Everything that allocates or is templated will likely be too slow or bloated in that case, so a specialized solution that takes advantage of context seems necessary.

Yes, real-time code today will avoid the GC altogether. Likely it will use custom everything: specialized allocators, caches, and container algorithms, all in order to avoid allocating from a general heap and to avoid memory fragmentation.
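As a sketch of the allocator end of that (a toy bump-pointer arena; real ones are more involved, and Phobos also has building blocks in std.experimental.allocator):

    // A fixed arena handed out bump-pointer style: no GC, no free(),
    // everything released at once with reset() (e.g. once per frame).
    struct Arena
    {
        ubyte[] buffer;
        size_t used;

        void[] alloc(size_t n) @nogc nothrow
        {
            if (used + n > buffer.length)
                return null;              // out of arena space
            auto block = buffer[used .. used + n];
            used += n;
            return block;
        }

        void reset() @nogc nothrow { used = 0; }
    }

    void main()
    {
        static ubyte[4096] storage;       // static: never touches the heap
        auto arena = Arena(storage[]);
        auto block = arena.alloc(128);
        assert(block !is null && block.length == 128);
        arena.reset();                    // "free" everything in O(1)
    }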

Many real-time systems have a real-time part but also a non-real-time part, which often runs a rich OS like Linux. Services in the rich-OS part can usually use a GC without any problems.

In terms of computer games, correct me if I'm wrong, but GC will become the norm there. The reason is that computer games are becoming more and more advanced, and there is so much going on that the GC becomes a lesser problem compared to everything else in terms of performance. If I had gone to a computer game technical manager 20 years ago and said I wanted to use a GC, he would probably have killed me. Today, if I said the same, he would say it's probably OK. I'm not that convinced by the Doom example, as there seems to be some confusion between caches and GC. You are welcome to offer more examples of early computer games that used GC.