Thread overview
The easiest way to compete with Rust and cure D's GC reputation: switch to ARC.
Jul 10
IchorDev
Jul 10
Serg Gini
Jul 10
Kapendev
Jul 10
Dukc
Jul 12
Dukc
Jul 12
Dom DiSc
Jul 13
Dom DiSc
Jul 14
Luna
July 09

Technically, I've mentioned this idea tangentially in comments elsewhere, twice, in a more muted and easily missed form, but I think it is one of the most compelling ways to make D (and indeed any aspiring easy-to-use, expressive systems programming language) compete better and be better engineered for the kinds of niches people would consider D for.

In particular, it seems to me that by far the easiest way to cure D of its GC-dependent reputation (one of the biggest reasons people avoid D) is to switch the entire standard language and library to ARC-based memory management. This would yield memory and performance characteristics on par with well-made modern C++ code, while remaining just as easy (or almost as easy) to manage as GC-based code.

This would require far less pervasive and tedious change to D than attempting to compete directly with Rust (or similar models), yet would yield most of the same benefit.

Basically, ARC (automatic reference counting) means deterministic reference counting: scope-based destructors are automatically called at all exit points, while the allocation and deallocation scheme is kept as predictable and constant-time as possible (within the constraints of modern hardware and operating systems, of course), for example by using a pool-like system (i.e. homogeneous allocation into separate memory structures per type size) or whatever else works.

The problem with GC in real applications (not to be confused with people parroting "GC is bad" online without actually understanding it) has never really been its computational overhead, which in total aggregate may even be lower than reference counting's (and in very constrained contexts like some embedded systems you'd probably be managing memory manually anyway). Rather, the real problem with GC has always been its unpredictability and indeterminable execution time.

Having memory collection run at some unknown future time, for an unknown duration, makes any language or ecosystem that depends on that GC unusable for hard realtime applications: safety-critical medical devices (where even a slight pause could cause an incorrect cut during surgery, etc.), or video games that want or need to eliminate unpredictable stuttering (whether to reduce the chance of epileptic seizures induced by frame stutter or just to provide a more pleasant experience). Because D's standard library and ecosystem are based on the GC, most of that ecosystem is infectiously rendered unusable for such purposes.

If ARC is implemented seamlessly enough, though, then downstream code could be mostly unaffected. The API could be kept the same, and GC calls could emit no-op warnings.

At the same time, changing the D website to advertise that D has the same performance characteristics as smart-pointer-based modern C++ code, and no longer advertising the language as garbage collected, could bring back many users who previously eliminated D from consideration because of the GC.

D's most similarly capable competitor, Nim, has already implemented this via its "ARC and ORC" system. Nim defaults to ORC, which is only partially deterministic for the sake of cycle elimination, but it can easily be switched to ARC on command, and the point still stands.

Personally, I don't much care about cycles, since they can be manually engineered around on a case-by-case basis, and I think true determinism is worth that cost.

More broadly, and as I mentioned in my comments elsewhere, it surprises me that more languages haven't realized that this middle road exists and is potentially the most effective "best of both worlds" for both ease and memory safety.

Combining this switch to ARC with ensuring that D's real-world library ecosystem is actually usable, and proven with examples of full-scale, complete software, seems like the best thing D could do to regain relevance and credibility in the broader programming world.


Unrelatedly and briefly, regarding my own D language status: after reading the very uninspiring and rhetorically disingenuous "it's just a tool" and "any moral objection regarding basic human rights is just politics" responses to my concerns about the D team's stance on automated plagiarism and digital personhood violation, I will most likely be switching to another language, community, and ecosystem that seems less ideologically and morally risky (not Nim, by the way, since they have sadly embraced such unconscionable "tools" even more enthusiastically). Of course, such immorality has been engineered to be "everywhere" now, seeing as the companies behind it have billions riding on it, but there are still degrees of its prevalence, and communities where it has less reach and influence, and I am still bound by my moral duty to act on that.

I will check back later, though, to see if things change (both regarding moral backbone and the D language and ecosystem itself), and I may pop by this thread again, or not; we'll see. I still think this (ARC) is a critical idea and wanted to mention it for the greater good as a tentative pseudo-"parting" gift. Let's see if you're capable of change, basically, and likewise (to be fair) myself, especially regarding my admittedly very bad language-dilettante (un)productivity habits. In any case, have a great day/night/week everyone!

July 10
>

The problem with GC in terms of real applications

What do you mean by real applications? Perhaps you meant realtime?

On Wednesday, 9 July 2025 at 17:10:26 UTC, WraithGlade wrote:

>

very uninspiring and rhetorically disingenuous "it's just a tool" and "any moral objection regarding basic human rights is just politics" response to my concerns

I haven't seen that second quote written by anyone anywhere.

Jordan

July 10

On Wednesday, 9 July 2025 at 17:10:26 UTC, WraithGlade wrote:

>

[...]

immutable and other significant parts of the language's design have married us to having a GC in some form. ARC is just another kind of GC, and if you ask me there's no clear across-the-board benefit to using it over the current GC design; it's a matter of context and preference. The new WIP Phobos 3 has had a big focus on reducing internal GC usage. And since ARC can already be implemented at the library level, it is not strictly necessary to have it as a language feature in order to use it.

July 10

On Wednesday, 9 July 2025 at 17:10:26 UTC, WraithGlade wrote:

>

...

General note:
Was this text generated with the help of AI?

Topic note:
ARC may be cool, yeah.. but did it help Nim? Nope.
Did it make Swift a fast language? Hell no..

So why bother then?

July 10

On Wednesday, 9 July 2025 at 17:10:26 UTC, WraithGlade wrote:

>

Having memory collection run at some unknown time in the future and with an unknown duration is a recipe for rendering any language or ecosystem that uses that GC unusable for hard realtime applications such as safety critical medical devices (where even a slight pause could cause an incorrect cut during surgery, etc) or video games that want/need to reduce unpredictable stuttering (such as to reduce the chance of epileptic seizures induced by frame stuttering or just to make a more pleasant experience). D's standard library and ecosystem being based on GC infectiously causes most of that ecosystem to thus be unusable.

D already allows GC-free code, and given how many successful indie games are made with C#, it's hard to argue that GC isn't a good fit for games. Can't you also disable and re-enable the GC at runtime?

July 10

On Wednesday, 9 July 2025 at 17:10:26 UTC, WraithGlade wrote:

>

In particular, it seems to me that by far the easiest way to cure D of its GC-dependent reputation (which is one of the biggest reasons why people avoid using D) is to switch the whole standard language and library to use ARC based memory management instead.

I don't think this is a good idea.

Because ARC suffers from cyclic-reference leaks that have to be manually worked around, D programs would more or less have to be designed like Rust programs. With a tracing GC, we can almost always allocate in fire-and-forget style. While that would still be the case most of the time with ARC, we would occasionally cause memory leaks with circular references, and then spend a lot of time figuring out the problem and designing around it.

Second, it means all pointer assignments would have to be write-gated to increment and decrement the reference counter. Not a big issue for most programs, but a minus nonetheless.

If it were just the second issue, it might well still be worth it. Likewise if the GC were mandatory and therefore made the most demanding real-time programming impossible, but it isn't. Considering how rarely you have to go totally without the GC, I think the first issue is a far bigger problem than the unpredictability of the tracing GC.

Then again, the last I heard from Ali, Weka.IO doesn't use the GC at all in their high-performance filesystem. Maybe they disagree with my assessment, and if they do, I might well change my mind.

July 12
Supporting ARC-style memory management has long since been discussed as a topic. This isn't a new idea.

For ARC to exist at the language level, all memory in D would have to have predictable access to its reference-count state. For example, a slice would need not only a pointer and a length, but also the state object.

The way languages like Objective-C handle these issues is to box raw types.

D has been designed foundationally to be based upon C. Our identifiers are defined against C23 (we are in the process of upgrading from C99). The bitfields we just added? C's ABI.

C does not have ARC; its structs do not have methods, let alone state for this. Our struct layout matches C's.

This is of course before we even get to C++ stuff.

In other words, trying to transition the entire language over to ARC would break every D code base in existence, and on top of that remove aspects of D that people use with fervor.

GC algorithms like the one we have are meant for memory whose cleanup time you do not care about. You should not be using it for deterministic things.

For deterministic memory, I recommend reference counting (including its ownership-transfer variation), or letting the call stack own the destruction via ``scope(exit)``.

In hard realtime systems, any allocation during hot times means you've failed; allocation isn't intrinsically deterministic. You allocate up front and reuse memory in a way that can be predicted. But this requires you to understand your memory patterns, which is a lot harder than it first seems. ARC doesn't change this: if you need ARC during those hot times, you've failed.

The ownership transfer wrappers that C++ has, are replicated in D.
https://dlang.org/phobos/std_typecons.html#Unique
Others examples of RC-like exist, including ones made by myself.

The above covers why we have not changed the entire language over to ARC. The D community values picking and choosing memory management strategies appropriate for the problem domain, not ideology.

So why don't we have RC in the language today, forcing us to work around it with copy constructors? Simply because we can't make it safe. Walter won't let it into the language without borrowing, and we can't have that without a way to do the data flow analysis, which we can't have on by default because it is slow and generates too many false positives.

Which is why I've been working on a fast data flow analysis engine that, by not reporting false positives, the community will hopefully accept being on by default. It won't make everyone happy, of course, so we'll also need a slow DFA that includes all the fun, more advanced analysis along with the false positives.

Ownership transfer is a derivative of reference counting; specifically, I would like us to have `isolated` from Midori. But again, that needs a DFA.
July 12

On Friday, 11 July 2025 at 20:18:59 UTC, Richard (Rikki) Andrew Cattermole wrote:

>

Supporting ARC-style memory management has long since been discussed as a topic. This isn't a new idea.

For ARC to exist at the language level, all memory in D would have to have predictable access to its reference-count state. For example, a slice would need not only a pointer and a length, but also the state object.

Well not quite.

The reference count for ARC allocations would presumably live in a runtime registry, not in the allocated object itself, just like runtime type information does for garbage-collected allocations.

Storing the ref count alongside the object would likely perform better, but I don't think it would be done like that in D, because the same reference types are used to deal with objects allocated in C.

July 12

On Saturday, 12 July 2025 at 21:36:18 UTC, Dukc wrote:

>

Storing the ref count alongside the object would likely perform better, but I don't think it would be done like that in D, because the same reference types are used to deal with objects allocated in C.

And a ref count separate from the object is necessary for const objects (as it needs to remain mutable).

July 13
On 13/07/2025 11:20 AM, Dom DiSc wrote:
> On Saturday, 12 July 2025 at 21:36:18 UTC, Dukc wrote:
>> Storing the ref count alongside the object would likely perform better, but I don't think it would be done like that in D, because the same reference types are used to deal with objects allocated in C.
> 
> And a ref count separate from the object is necessary for const objects (as it needs to remain mutable).

Mutability at the language level doesn't matter for this kind of thing.

For internal details like this you can break const, immutable, and shared; they are a preventative measure for the human, not for the runtime.

What matters is whether something is in read-only memory; as long as you treat such memory as having an infinite lifetime, it is okay. Remove the RC state and it won't affect anything.

This comes up for classes: they can't be in ROM because the monitor needs to be mutable. If it could be nulled out, it wouldn't be an issue.