On 4 February 2014 10:19, Adam Wilson <flyboynw@gmail.com> wrote:
On Mon, 03 Feb 2014 15:26:22 -0800, Frustrated <c1514843@drdrb.com> wrote:

On Monday, 3 February 2014 at 21:42:59 UTC, Shammah Chancellor wrote:

You can always force the GC to run between cycles in your game, and
turn off automatic sweeps.  This is how most games operate nowadays.
It's also probably possible to create a drop-in replacement for the GC
to do something else.   I could see it being *VERY* useful to make the
GC take a compile-time parameter to select which GC engine is used.
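
A minimal sketch of that pattern, for anyone who wants to try it:
druntime's core.memory.GC exposes disable(), enable(), and collect(),
so you can suppress automatic sweeps and collect only at points you
choose. (gameRunning, updateAndRender, and atSafePoint below are
made-up placeholders, not a real engine API.)

    import core.memory : GC;

    int frame;
    bool gameRunning()     { return frame < 600; }     // placeholder loop condition
    void updateAndRender() { ++frame; }                // placeholder per-frame work
    bool atSafePoint()     { return frame % 60 == 0; } // e.g. once per second at 60 fps

    void main()
    {
        GC.disable();            // suppress automatic collections...
        scope (exit) GC.enable();

        while (gameRunning())
        {
            updateAndRender();
            if (atSafePoint())
                GC.collect();    // ...and sweep only where a pause is acceptable
        }
    }

Note that disable() is not an absolute guarantee: druntime reserves the
right to collect anyway in an out-of-memory condition.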


This is just nonsense. Maybe this is why modern games suck, then?
How do you guarantee that the GC won't kick in at the most
inopportune times? Oh, you manually trigger it? When? Right at
the moment when the player is about to down the boss after a
45-minute fight?

Oh, right... you just segfault because there is no memory left.

On Monday, 3 February 2014 at 22:51:50 UTC, Frank Bauer wrote:

I'm not quite sure that I understand what you mean by GC avoidance being a major focus of 2014, though. In the long term, can I look forward to writing an absolutely, as in 100%, as in guaranteed, GC-free D app with all of current D's and Phobos' features if I choose to? Or does it mean: for the most part it will avoid the GC, but if you're unlucky the GC might still kick in if you don't pay attention and when you least expect it?

It's got to be either 100% or nothing. The only issue with the GC
is the non-determinism... or, if you do corner it and trigger it
manually, you end up with exactly the kinds of problems Mr.
Chancellor thinks don't exist: the longer you have to put off the
GC, the worse it becomes (the more time it takes to run, or the
less memory you have to work with).


Why is this myth of non-determinism still alive? The only truly non-deterministic GCs are concurrent collectors, but concurrent collectors don't routinely stop the world either, so there really aren't any pauses to complain about. D's mark-sweep GC is *perfectly* deterministic: it can *only* pause on allocation. Ergo, you can determine exactly which allocation caused the problem. You might not expect the function you called to GC-allocate, but that doesn't make it non-deterministic, just not what you expected. Please stop blaming your missed expectations on the GC. This non-determinism thing is a red herring that is repeated over and over by people who obviously have no idea what they are talking about.
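
To illustrate (a small, invented example, not anything from the
thread): the only place a collection can begin is an allocation, so a
loop that allocates nothing can never be paused by the collector.

    import core.memory : GC;

    // No GC allocations in here, so the collector has no opportunity
    // to pause this loop.
    void hotLoop(int[] buf)
    {
        foreach (ref x; buf)
            x *= 2;
    }

    void main()
    {
        auto buf = new int[1024]; // a collection can only be triggered here,
                                  // at an allocation site...
        hotLoop(buf);
        GC.collect();             // ...or where you explicitly ask for one
    }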

Your assertion makes the assumption that people who write huge, complex programs have complete control over their code and/or depend on zero libraries, which is a ridiculous notion.
I'm quite sick of people making claims like that. If the default is to do something incompatible with my use case, and I depend on any libraries (including Phobos), then it's safe to say I'm running code that is incompatible with my use case.
What are my options then?

Additionally, in D you don't have to type 'new' to allocate memory; it happens all the time... a closure here, a concatenation there. These are core language features, and to say that I'm meant to avoid them in my million-line program, authored by perhaps hundreds of programmers, because I need to be certain where every allocation is being issued from, is quite unrealistic to say the least.
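
To make that concrete (a small sketch; makeCounter and greet are
invented names), both of the following allocate on the GC heap without
a single 'new' in sight:

    // Returning a delegate that captures a local variable forces the
    // closure's environment onto the GC heap.
    int delegate() makeCounter()
    {
        int n;
        return () => ++n;  // 'n' must outlive the stack frame
    }

    // Runtime string concatenation with '~' allocates a new GC array.
    string greet(string name)
    {
        return "hello, " ~ name;
    }

    void main()
    {
        auto next = makeCounter();
        auto a = [1, 2, 3];
        a ~= 4;                    // appending may allocate or reallocate too
        auto msg = greet("world");
    }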