On 1 February 2014 19:27, Adam Wilson <flyboynw@gmail.com> wrote:
On Fri, 31 Jan 2014 23:35:44 -0800, Manu <turkeyman@gmail.com> wrote:

On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com> wrote:

On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:

 On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:

  std.typecons.RefCounted!T


core.memory.GC.disable();


Wow. That was easy.

I see, D's claim of being a multi-paradigm language is not false.
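Concretely, the two suggestions amount to something like this (a sketch only; Widget is just a stand-in type for illustration):

import core.memory : GC;
import std.typecons : RefCounted;

struct Widget { int value; }

void main()
{
    GC.disable();                      // no automatic collection cycles from here on

    auto w = RefCounted!Widget(42);    // payload freed when the last copy goes away
    assert(w.value == 42);

    GC.enable();                       // re-enable before running GC-reliant code
}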



It's not a realistic suggestion. Everything you want to link uses the GC,
and the language itself also uses the GC. Unless you write software in
complete isolation and forego many valuable features, it's not a solution.


Phobos does rely on the GC to some extent. Most algorithms and ranges do
not, though.
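For instance, a lazy pipeline like the one below composes entirely out of stack value types and never touches the GC heap (the numbers are arbitrary, just to sketch the idea):

import std.algorithm : filter, map, sum;
import std.range : iota;

void main()
{
    // iota/filter/map build value-type range structs; nothing here allocates
    auto total = iota(1, 100)
        .filter!(n => n % 3 == 0)
        .map!(n => n * n)
        .sum;
    assert(total == 112_761);
}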


Running (library) code that was written with GC in mind and turning GC off
doesn't sound ideal.

But maybe this allows me to familiarise myself more with D. Who knows,
maybe I can learn to stop worrying and love garbage collection.

Thanks for your help!


I've been trying to learn to love the GC for as long as I've been around
here. I really wanted to break that mental barrier, but it hasn't
happened.
In fact, I am more than ever convinced that the GC won't do. My current #1
wishlist item for D is the ability to use a reference counted collector in
place of the built-in GC.
You're not alone :)

I write realtime and memory-constrained software (console games), and for
me, I think the biggest issue that can never be solved is the
non-deterministic nature of the collect cycles, and the unknowable memory
footprint of the application. You can't make any guarantees or predictions
about the GC, which is fundamentally incompatible with realtime software.
Language-level ARC would probably do quite nicely for the miscellaneous
allocations. Obviously, bulk allocations are still usually best handled in
a context-sensitive manner; i.e., regions/pools/freelists/whatever, but the
convenience of the GC paradigm does offer some interesting and massively
time-saving features to D.
Everyone will always refer you to RefCounted, which mangles your types and
pollutes your code, but aside from that, for ARC to be useful, it needs to
be supported at the language level, such that the language/optimiser is
able to optimise out redundant incref/decref calls, and also that it is
compatible with immutable (you can't manage a refcount if the object is
immutable).
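To make the "mangling" concrete, here's roughly what the library approach looks like today; Texture and bind are just illustrative names:

import std.typecons : RefCounted;

struct Texture { int id; }

// The wrapper leaks into every signature that touches the type.
void bind(RefCounted!Texture tex)
{
    // passing by value bumped the count on the way in; it drops again on return
}

void main()
{
    auto tex = RefCounted!Texture(7);
    bind(tex);
    // immutable RefCounted!Texture frozen = tex; // won't compile: the count
    // must be mutated even when the payload never is
}

With language-level ARC the count bookkeeping is invisible, which is also what would let the optimiser pair up and remove redundant increments/decrements.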


The problem isn't GCs per se, but D's horribly naive implementation. Games
are written in GC'd languages all the time now (Unity/.NET). And let's be
honest, games are something of a speciality; they do things most programs
will never do.

You might want to read the GC Handbook. GCs aren't bad, but most, like
the D GC, are just too simplistic for common usage today.


Maybe a sufficiently advanced GC could address the performance
non-determinism to an acceptable level, but you're still left with the
memory non-determinism, and the conundrum that when your heap approaches
full (which is _always_ the case on a games console), the GC has to work
harder and harder, and more often, just to scrape together the tiny bit of
headroom that remains.
A GC heap by nature expects you to have lots of memory, and also lots of
FREE memory.

No serious console game I'm aware of has ever been written in a language
with a GC. Casual games, or games that don't attempt to raise the bar, may
get away with it, but that's not the industry I work in.

That's kind of my point. You're asking for massive changes throughout the entire compiler to support what is becoming more of an edge case, not less of one. For the vast majority of use cases, a GC is the right call and D has to cater to the majority if it wants to gain any significant mindshare at all. You don't grow by increasing specialization...

Why is ARC any worse than GC? Why is it even a compromise at the high level?
Major players have been moving away from GC to ARC in recent years. It's still a perfectly valid method of garbage collection, and it has the advantage that it's intrinsically real-time compatible.
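The real-time property boils down to this: the release cost is paid at a known point in the code, not in an unbounded collect cycle. A hand-rolled sketch of what compiler-inserted retain/release would amount to (names purely illustrative):

import core.stdc.stdlib : malloc, free;

struct Node
{
    size_t refs;
    int payload;
}

Node* retain(Node* n) { ++n.refs; return n; }

void release(Node* n)
{
    if (--n.refs == 0)
        free(n);             // deterministic: memory is returned exactly here, never in a pause
}

void main()
{
    auto n = cast(Node*) malloc(Node.sizeof);
    n.refs = 1;
    n.payload = 123;

    auto copy = retain(n);   // what the compiler would emit when a new reference appears
    release(copy);           // ...and when that reference dies
    release(n);              // count hits zero, freed immediately
}

Each retain/release is a tiny, bounded operation, so the cost folds into the frame budget instead of showing up as a collection spike.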

I don't think realtime software is becoming an edge case by any means; maybe 'extreme' realtime is, but that doesn't change the fact that the GC still causes problems for all realtime software.

I personally believe latency and stuttering are among the biggest usability hindrances in modern computing, and they will become a specific design focus in software of the future. People are becoming less and less tolerant of latency in all its forms; just consider the success of the iPhone compared to the competition, attributable almost entirely to its silky-smooth UI experience. It may also be telling that Apple switched to ARC around the same time, but I don't know the details.

I also firmly believe that if D - a natively compiled language familiar to virtually all low-level programmers - has no ambition to serve the embedded space in the future, then what language will? And why shouldn't it?
The GC is the only thing inhibiting D from being a good fit in that context. ARC is far more appropriate, and would see it enter a lot more fields.
What's the loss?

I think it's also telling that newcomers constantly raise it as a massive concern, or even a deal-breaker. Would they feel the same about ARC? I seriously doubt it. I wonder if a poll is in order...
Conversely, would any of the newcomers who are pro-GC feel any less happy if it were ARC instead? $100 says they probably wouldn't even know, and almost certainly wouldn't care.