December 13, 2022

On 12/13/22 9:45 PM, Ali Çehreli wrote:
> On 12/13/22 18:05, H. S. Teoh wrote:
>> Hmm.  Whatever happened to that proposal for GC-less exceptions?
>> Something about allocating the exception from a static buffer and
>> freeing it in the catch block or something?
>
> I have an errornogc module here:
>
>   https://code.dlang.org/packages/alid
>
> I hope it still compiles. :)

FYI, throwing actually uses the GC unless you override the traceinfo allocator. Yes, even if it's marked @nogc (functions marked @nogc can still call arbitrary C functions that might allocate using the GC).
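
As an illustration, a minimal sketch of one way around that trace allocation, assuming druntime's settable Runtime.traceHandler (clearing it skips trace info generation; note the Exception object itself is still GC-allocated by new here):

import core.runtime : Runtime;

void main() {
	Runtime.traceHandler = null; // no TraceInfo is generated on throw
	try
		throw new Exception("boom"); // new still uses the GC, but the throw allocates no trace
	catch (Exception e)
		assert(e.msg == "boom");
}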

-steve

December 14, 2022

On Tuesday, 13 December 2022 at 07:11:34 UTC, thebluepandabear wrote:
> Does this claim have merit? I am not far enough into learning D, so I haven't touched GC stuff yet, but I am curious what the D community has to say about this issue.

I disagree with the majority opinion on this subject. I find D's GC to be heavily oversold, when it is simply not applicable to large portions of my use case (game development). It certainly solves certain problems, but it introduces new ones. The mentality behind the GC emphasis seems to be that most (all!) people will never encounter those problems, so they should literally just not think about it and trust the GC - until suddenly they can't anymore and the whole thing breaks apart. Alternative strategies do exist, obviously, but they're often shoved into the backroom, with the salespeople only leading the customers to them after much grumbling and fumbling with their keys.

How do you instantiate a class object in D using the GC?

new Foo;

How do you instantiate one using malloc? Something like:

import core.memory;
import core.stdc.stdlib : malloc, free;
import core.lifetime : emplace;
T NEW(T, Args...)(auto ref Args args) /*@nogc (nope!)*/ if (is(T == class)) {
	enum size = __traits(classInstanceSize, T);
	void* mem = malloc(size);
	scope(failure) free(mem);
	//throw OOMError.get("Out of Memory in NEW!"~T.stringof); // wanna GC-allocate here? use a predefined object? or just ignore?
	return mem !is null ? emplace!T(mem[0..size], args) : null;
}
// and don't forget
void FREE(T)(ref T obj) /*@nogc*/ if (is(T == class)) {
	if (obj is null) return;
	auto mem = cast(void*) obj;
	//debug if (!GC.inFinalizer && GC.addrOf(mem)) return;
	scope(exit) free(mem);
	destroy(obj);
	obj = null;
}
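
Usage then looks something like this (a sketch; this particular Foo, with an int field, is made up for the demo):

class Foo {
	int x;
	this(int x) { this.x = x; }
}

void useIt() {
	Foo f = NEW!Foo(42);   // malloc + emplace, no GC involvement
	scope(exit) FREE(f);   // destroy + free, and the reference is nulled
	assert(f !is null && f.x == 42);
}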

To people who are experienced with D, that's par for the course. And people who have already done a good deal of thinking about memory management in performance-intensive scenarios will understand the need to learn their language's alternatives from the start. But presenting that to newcomers as the alternative to the officially right way everyone is supposed to use is just not attractive. It contradicts one of D's core philosophies, IMO: "solve basic problems and prevent easy bugs that most people walk into without thinking" sounds like an admirable goal, aimed at drawing in and protecting new users. Except those users are then handed a tool that, used in the intended way (without thinking about it), creates problems the moment they wander into certain domains of development. The explanation of "well, obviously you need to think about it if you're going to be doing THAT!" just doesn't mesh with the way it's initially sold.

Pipe dream: why not new malloc Foo;? (Or "deterministic" or something. And then, new rc Foo;!) What if, to prevent accidental intermingling, it were a storage class? malloc Foo mfoo = new Foo; // Error! Just thinking out loud. Part of this can already be done by wrapping everything in structs and templates, but that's just more noise.

That said, regarding your specific question: there are numerous parts of the D standard library you can safely use without allocating with the GC (I do, and still love it), and non-allocating alternatives are often added (e.g. .joiner as the lazy counterpart of .join). The problem remains that many components can't be explicitly @nogc - a caveat I find just not worth worrying about anymore, since it takes up time and effort better spent on the code itself than on solving a trillion compiler errors trying to satisfy every possible aspect and edge case of @nogc-dom.
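
To make the .join/.joiner distinction concrete, a small sketch (the strings are arbitrary):

import std.array : join;
import std.algorithm.iteration : joiner;
import std.algorithm.comparison : equal;

void main() {
	auto words = ["one", "two", "three"];
	string eager = words.join(", ");     // eagerly GC-allocates a new string
	auto lazyRange = words.joiner(", "); // lazy range, allocates nothing
	assert(eager == "one, two, three");
	assert(equal(lazyRange, eager));     // same contents, without the allocation
}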

There is a lot of code you can write in D that, even without going over the stdlib with a fine-toothed comb, you can be reasonably sure will probably never GC-allocate, even if it's not explicitly @nogc - if that's an acceptable tradeoff for the code safety requirements of your use case. std.container.array is good, as previously mentioned. You can build on it and make malloc-backed or reference-counted variations of hashmaps/associative arrays too (if you want to spend the effort), and I believe there are already some third-party libraries on dub for that. The operator overloading and syntactic sugar are good enough that everything can look "just like" native D runtime/GC constructs if you want, except for some declarations (though none of that is exactly "out of the box", if we're still thinking of the prospective new user here).
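
For example, std.container.Array stores its payload outside the GC heap and frees it deterministically, so something like this never triggers a collection (a minimal sketch):

import std.container.array : Array;

void main() {
	Array!int nums;         // malloc-backed storage, not the GC heap
	foreach (i; 0 .. 1_000)
		nums.insertBack(i); // grows without touching the GC
	assert(nums[499] == 499);
}   // storage is released here, deterministically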

tl;dr: I don't hate the GC. It's great for one-and-dones. I just wish it wasn't so heavily lauded as The Truth & The Way.

December 14, 2022

On Wednesday, 14 December 2022 at 08:01:47 UTC, cc wrote:
> I disagree with the majority opinion on this subject.
> How do you instantiate a class object in D using the GC?
>
> new Foo;
>
> How do you instantiate one using malloc? Something like:
>
> ....

Yes, but be aware that this kind of stuff is what you would also need to do in C++ to make it safer - and nobody does it, because it's so awful.
And in D you almost never need it, because it is sufficient to turn off the GC only in your performance-critical loops.
So you get the same performance and the same (or better) memory safety with only a tiny fraction of the hassle.
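
A minimal sketch of that pattern (update and render are stand-ins for real per-frame work):

import core.memory : GC;

void update() { /* game logic; ideally avoids allocating */ }
void render() { /* drawing */ }

void main() {
	GC.disable();            // collections are suspended inside the loop (allocation still works)
	scope(exit) GC.enable(); // re-enabled however we leave

	foreach (frame; 0 .. 3)  // stand-in for the real frame loop
	{
		update();
		render();
	}

	GC.collect();            // collect once, at a convenient pause point
}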

December 14, 2022
On Wednesday, 14 December 2022 at 09:03:58 UTC, Dom DiSc wrote:
>
> ....
> And in D you almost never need this, because it is sufficient to turn off the GC only in your performance critical loops.
> So you get the same performance and the same (or better) memory safety with only a tiny part of the hassle.

Yes, people just need to take the GC chill pill.

https://dlang.org/blog/2017/06/16/life-in-the-fast-lane/

In the not-too-distant future, manual memory management will be outlawed.

December 14, 2022
On Wednesday, 14 December 2022 at 08:01:47 UTC, cc wrote:
> ...
> I disagree with the majority opinion on this subject.  I find D's GC to often be heavily oversold, when it is particularly not applicable to many portions of my use case (game development)...
> ...

I used to write games (personal projects), and believe it or not my main language is still good old C most of the time. I already have my lib and my way of doing things, so it's no big deal for me!

But when I tried D for the same thing, I did the basic thing: enable the GC, load and instantiate everything for the level, disable the GC and run the game, then enable it again after the level is over - and I don't remember having much trouble.

For example, missiles/projectiles are added to a pre-allocated space as they are fired (see the sketch below), but what I sometimes see (in other people's code) is these things being allocated in real time, which I don't like.

I don't know if this is the problem some people have with the GC - allocating in real time - but I'd avoid it, unless game development has changed a lot and people now like to allocate in real time. (By the way, I'm not saying you're doing this.)
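
A minimal sketch of that pre-allocated approach (Projectile and MAX_PROJECTILES are made-up names for the demo):

enum MAX_PROJECTILES = 256;

struct Projectile { float x, y, vx, vy; bool alive; }

struct ProjectilePool {
	Projectile[MAX_PROJECTILES] slots; // fixed storage, allocated up front

	// Firing reuses a dead slot instead of allocating in real time.
	bool fire(float x, float y, float vx, float vy) @nogc nothrow {
		foreach (ref p; slots) {
			if (!p.alive) {
				p = Projectile(x, y, vx, vy, true);
				return true;
			}
		}
		return false; // pool exhausted; the caller decides what happens
	}
}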

Matheus.

December 14, 2022

On Wednesday, 14 December 2022 at 10:06:40 UTC, matheus wrote:
> But when I tried D for the same thing, I did the basic thing: enable the GC, load and instantiate everything for the level, disable the GC and run the game, then enable it again after the level is over - and I don't remember having much trouble.

Pre-allocated lists are fine for many cases. We typically use them for particle engines now, when we can be comfortable with a specific hard limit and the initial resource draw isn't a significant burden. But in some of our engines we decided we wanted to be able to scale without entity limits, especially for persistent always-online worlds. Naive porting from reference-counted languages to D's GC led to unacceptable resource usage and performance losses that GC.disable couldn't work around. It doesn't matter that the actual delay caused by a GC pause is minimal if those delays cause significant recurring hiccups and timing errors. Allocations and deallocations had to be more deterministic. Ultimately it came down to the decision that, given the need to offload and reschedule memory management so it was more evenly distributed, working around the GC would be more trouble than just avoiding it in the first place.

December 14, 2022

On Wednesday, 14 December 2022 at 11:12:52 UTC, cc wrote:
> Pre-allocated lists are fine for many cases. We typically use them for particle engines now, when we can be comfortable with a specific hard limit and the initial resource draw isn't a significant burden.

Is your game available somewhere?

December 14, 2022
On Wednesday, 14 December 2022 at 09:27:54 UTC, areYouSureAboutThat wrote:
>
> Yes, people just need to take the GC chill pill.
>
> https://dlang.org/blog/2017/06/16/life-in-the-fast-lane/
>
> In the not-too-distant future, manual memory management will be outlawed.

I'm in GC rehab, trying to distance myself from the GC.

When you think about it, tracing GC is one of the craziest algorithms in computer science. Still, it is widely used because it covers all the corner cases. However, the complexity of getting it all to work is huge.

December 14, 2022

On Tuesday, 13 December 2022 at 07:11:34 UTC, thebluepandabear wrote:
> Hello,
>
> I was speaking to one of my friends about the D language, and he spoke about how he doesn't like D due to the fact that its standard library is built on top of GC (garbage collection).
>
> He said that if he doesn't want to use the GC, he misses out on the standard library, which for him is a big disadvantage.
>
> Does this claim have merit? I am not far enough into learning D, so I haven't touched GC stuff yet, but I am curious what the D community has to say about this issue.

It's more of a small ecosystem divide than a hindrance (you can always write more restricted code).

The GC by itself can have an arbitrarily low cost, so it's not the problem.

What is a problem is secret use of druntime when you wanted no druntime things going on, often for compatibility reasons - say, WebASM or consoles. But if you have reasons to avoid the D runtime, or reasons to build a minimal D runtime, you should expect not to be able to use the stdlib! Else, what would the runtime be for?

December 14, 2022

On Wednesday, 14 December 2022 at 01:47:29 UTC, Steven Schveighoffer wrote:
> void main() @nogc
> {
>    import std.conv;
>    auto v = "42".to!int;
> }

I have been wondering why there isn't a basic variation like this available:

auto i = "42".toOr!int(-1);
auto s = i.toOr!string(null);
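
Something like this would do it (a sketch; toOr here is the wished-for name, not an existing Phobos function - and note it can't be @nogc, since to may throw a GC-allocated ConvException):

import std.conv : to, ConvException;

T toOr(T, S)(S value, T fallback) {
	try
		return value.to!T;
	catch (ConvException)
		return fallback; // fall back instead of propagating the exception
}

void main() {
	assert("42".toOr!int(-1) == 42);
	assert("not a number".toOr!int(-1) == -1);
}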