February 03, 2014
 ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb.  I bet they never tried to use a GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It only runs when I allocate, oh what a fool I've been, please castigate me harder!

 Adam pls tell me more of this c# and that amazing gc it sounds so good




On Monday, 3 February 2014 at 21:42:59 UTC, Shammah Chancellor wrote:
> On 2014-02-01 07:35:44 +0000, Manu said:
>
>> On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com> wrote:
>> On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:
>> 
>> On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:
>> 
>>  std.typecons.RefCounted!T
>> 
>> core.memory.GC.disable();
>> 
>> 
>> Wow. That was easy.
>> 
>> I see, D's claim of being a multi-paradigm language is not false.
>> 
>> 
>> It's not a realistic suggestion. Everything you want to link uses the GC,
>> and the language itself also uses the GC. Unless you write software in
>> complete isolation and forgo many valuable features, it's not a solution.
>> 
>> 
>>  Phobos does rely on the GC to some extent. Most algorithms and ranges do
>> not though.
>> 
>> 
>> Running (library) code that was written with GC in mind and turning GC off
>> doesn't sound ideal.
>> 
>> But maybe this allows me to familiarise myself more with D. Who knows,
>> maybe I can learn to stop worrying and love garbage collection.
>> 
>> Thanks for your help!
>> 
>> 
>> I've been trying to learn to love the GC for as long as I've been around
>> here. I really wanted to break that mental barrier, but it hasn't happened.
>> In fact, I am more than ever convinced that the GC won't do. My current #1
>> wishlist item for D is the ability to use a reference counted collector in
>> place of the built-in GC.
>> You're not alone :)
>> 
>> I write realtime and memory-constrained software (console games), and for
>> me, I think the biggest issue that can never be solved is the
>> non-deterministic nature of the collect cycles, and the unknowable memory
>> footprint of the application. You can't make any guarantees or predictions
>> about the GC, which is fundamentally incompatible with realtime software.
>> Language-level ARC would probably do quite nicely for the miscellaneous
>> allocations. Obviously, bulk allocations are still usually best handled in
>> a context-sensitive manner, i.e. regions/pools/freelists/whatever, but the
>> convenience of the GC paradigm does offer some interesting and massively
>> time-saving features to D.
>> Everyone will always refer you to RefCounted, which mangles your types and
>> pollutes your code, but aside from that, for ARC to be useful, it needs to
>> be supported at the language-level, such that the language/optimiser is
>> able to optimise out redundant incref/decref calls, and also that it is
>> compatible with immutable (you can't manage a refcount if the object is
>> immutable).
>> 
>> The problem isn't GCs per se, but D's horribly naive implementation. Games are written in GC languages all the time now (Unity/.NET). And let's be honest, games are something of a specialty; they do things most programs will never do.
>> 
>> You might want to read the GC Handbook. GCs aren't bad, but most, like the D GC, are just too simplistic for common usage today.
>> 
>> Maybe a sufficiently advanced GC could address the performance non-determinism to an acceptable level, but you're still left with the memory non-determinism, and the conundrum that when your heap approaches full (which is _always_ on a games console), the GC has to work harder and harder, and more often, to try to keep the tiny little bit of free memory available.
>> A GC heap by nature expects you to have lots of memory, and also lots of FREE memory.
>> 
>> No serious console game I'm aware of has ever been written in a language with a GC. Casual games, or games that don't attempt to raise the bar may get away with it, but that's not the industry I work in.
>
> You can always force the GC to run between cycles in your game, and
> turn off automatic sweeps.  This is how most games operate nowadays.
> It's also probably possible to create a drop-in replacement for the GC
> to do something else.  I could see it being *VERY* useful to make the
> GC take a compile-time parameter to select which GC engine is used.

February 03, 2014
On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:
>  ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb.  I bet they never tried to use a GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It only runs when I allocate, oh what a fool I've been, please castigate me harder!
>
>  Adam pls tell me more of this c# and that amazing gc it sounds so good

+1
February 03, 2014
 Any system that forces a single way of handling memory as the only viable method, be it GC( as D currently does), or ARC, is undesirable to me

  Rust seems to have found a very nice model, and even C++ with value/unique/rc/weak semantics is IMO superior to what D currently offers.
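
  For reference, the closest things D currently offers in the library are std.typecons.Unique and std.typecons.RefCounted. A minimal sketch (the Payload struct is just an illustration, and both types still sit on top of the GC-managed heap rather than being language features):

  import std.typecons : Unique, RefCounted;

  struct Payload { int value; }

  void main()
  {
      // Single owner; the payload is destroyed when `u` goes out of scope.
      // Note: the memory itself is still GC-allocated under the hood.
      Unique!Payload u = new Payload(1);

      // Shared ownership via a reference count; destroyed when the last
      // copy disappears. This is the RefCounted!T mentioned earlier.
      auto rc = RefCounted!Payload(2);
      assert(rc.value == 2);
  }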

On Monday, 3 February 2014 at 22:51:50 UTC, Frank Bauer wrote:
> Andrei Alexandrescu wrote:
>
>> 2. Work on Phobos to see what can be done about avoiding unnecessary allocation. Most likely we'll need to also add a @nogc flag.
>>
>> ...
>>
>> 4. Work on the core language and druntime to see how to seamlessly accommodate alternate GC mechanisms such as reference counting.
>>
>> ...
>>
>> I thought I made it clear that GC avoidance (which includes considering built-in reference counting) is a major focus of 2014.
>>
>> Andrei
>
> My apologies then. Not apologizing for stressing this point over and over again, though :)
>
> Just to steal some people's thunder: I want the GC to remain in D as it is. This is not hyperbole or cynicism. I mean it. Having the GC around to clean up after me for noncritical apps is a blessing. Adding ARC and owning pointers would make my (and others) day though.
>
> BTW: Why is most everybody who doesn't like GC obsessed with ARC pointers? I like them for where they are appropriate. But what about a simpler owning pointer that frees automatically when it goes out of scope. It comes at exactly zero runtime cost, just like a "dumb" pointer. You don't have to drag a reference count around with you. It is just a pointer in memory. All its semantics are enforced at compile time. Throw borrowed pointers in the mix, i.e. references to owning pointers that become "frozen" while there are outstanding references to them, and you're all set.
>
> I think this would really be necessary (and sufficient!) to get Rust off your back and whatever half-baked language stuff MS is intentionally leaking these days. At least IMHO and in the opinion of others from the *vocal minority*.
>
> I'm not quite sure that I understand what you mean by GC avoidance being a major focus of 2014 though. In the long term, can I look forward to writing an absolutely, like in 100 %, like in guaranteed, GC free D app with all of current D's and Phobos' features if I choose to? Or does it mean: well for the most part it will avoid the GC, but if you're unlucky the GC might still kick in if you don't pay attention and when you least expect it?

February 03, 2014
An owning pointer is the perfect garbage collector: it gets rid of the garbage exactly at the moment it goes out of scope.
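
To make that concrete, here is a minimal sketch of such an owning pointer in today's D, using the C heap so the GC is not involved at all (the Owned type and its names are purely illustrative, not a proposal; the borrowing part described above would need compiler support):

import core.stdc.stdlib : malloc, free;
import core.lifetime : emplace;

struct Owned(T)
{
    private T* ptr;

    this(T value)
    {
        ptr = cast(T*) malloc(T.sizeof);
        emplace(ptr, value);        // construct the payload in place
    }

    @disable this(this);            // no copies: exactly one owner at a time

    ~this()
    {
        if (ptr !is null)
        {
            destroy(*ptr);          // run the payload's destructor, if any
            free(ptr);              // deterministic release, right here
        }
    }

    ref T get() { return *ptr; }
}

void main()
{
    auto owned = Owned!int(42);
    assert(owned.get == 42);
}   // `owned` leaves scope here and the memory is freed immediately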
February 03, 2014
On Monday, 3 February 2014 at 21:42:59 UTC, Shammah Chancellor
wrote:
>
> You can always force the GC to run between cycles in your game, and
> turn off automatic sweeps.  This is how most games operate nowadays.
> It's also probably possible to create a drop-in replacement for the GC
> to do something else.  I could see it being *VERY* useful to make the
> GC take a compile-time parameter to select which GC engine is used.


This is just nonsense. Maybe this is why modern games suck then?
How do you guarantee that the GC won't kick in at the most
inopportune times? Oh, you manually trigger it? When? Right at
the moment when the player is about to down the boss after a
45-minute fight?

Oh, right... you just segfault because there is no memory left.

On Monday, 3 February 2014 at 22:51:50 UTC, Frank Bauer wrote:

> I'm not quite sure that I understand what you mean by GC avoidance being a major focus of 2014 though. In the long term, can I look forward to writing an absolutely, like in 100 %, like in guaranteed, GC free D app with all of current D's and Phobos' features if I choose to? Or does it mean: well for the most part it will avoid the GC, but if you're unlucky the GC might still kick in if you don't pay attention and when you least expect it?

It either has to be 100% or nothing. The only issue with the GC
is the non-determinism... or, if you do corner it and trigger it
manually, you end up with exactly the types of problems Mr.
Chancellor thinks don't exist... i.e., the longer you have to
put off the GC, the worse it becomes (the more time it takes to run
or the less memory you have to work with).

It might work OK with some concurrent AGC that you can use for
non-critical parts. E.g., have Phobos use the GC for non-real-time
sections of the app (boot-up, menus for games, etc...), then
disable it and use ARC when the app is meant for optimal
and/or deterministic performance.

One could argue that if one goes this route, why not use ARC or
manual memory management (MMM?) the whole time... but by using the
GC during the non-critical parts of the program, one can focus
less on memory leaks, as usual with the GC.

What would be nice is to be able to write code that is oblivious
to how its memory is managed, but be able to switch between
different methods.

[Startup -> menus and other non-performance sections of the game] <- use GC
[real-time areas of game] <- use ARC or manual
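
A rough sketch of what that phase switching looks like with the current runtime (the two phase functions are made up for illustration, and since ARC does not exist in D today the real-time phase simply has to avoid GC allocation):

import core.memory : GC;

void loadAssetsAndMenus() { /* GC allocations are fine here */ }
void runOneFrame()        { /* must avoid GC allocation while collections are off */ }

void main()
{
    // Non-critical phase: let the GC do its thing.
    loadAssetsAndMenus();

    // Entering the real-time phase: collect once at a known point,
    // then suppress automatic collections. Note that disable() only
    // suppresses collections; the runtime may still collect if an
    // allocation cannot otherwise be satisfied.
    GC.collect();
    GC.minimize();
    GC.disable();

    foreach (frame; 0 .. 10_000)
        runOneFrame();

    // Back in a non-critical section (menu, loading screen, ...):
    // re-enable the GC and collect now, when a pause is harmless.
    GC.enable();
    GC.collect();
}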

February 04, 2014
On Monday, 3 February 2014 at 23:13:07 UTC, woh wrote:

>   Rust seems to have found a very nice model ...

How cool would that be: a language with the memory allocation features of Rust, but with its syntax and semantics rooted more strongly in the C/C++ world?
February 04, 2014
On Monday, 31 December 2012 at 12:14:22 UTC, Sven Over wrote:
> In my job I'm writing backend services that power a big web site. Performance is key, as the response time of the data service in most cases directly adds to the page load time. The bare possibility that the whole service pauses for, say, 100ms is making me feel very uncomfortable.

Back to the original scenario:

Current CPUs have a measured peak throughput of 2-4GB per 100ms. So if you keep pointers clustered on cache lines in a way that is compatible with the AGC's prefetching patterns, then you ought to be fine for most applications if the collector is optimized?

If your backend service (data) is replicated on multiple servers, you could, of course, let the servers run the GC at different times, coordinated by a load balancer/router/task scheduler.

So if GC makes you more productive on a web server, then I don't really see why you wouldn't use it.

Now, that might not work out in the future though. If CPU caches keep growing to the extent that most of your working dataset sits in the cache and you have lots of rather slow main memory then the AGC process will become sloooow no matter how good the GC is.
February 04, 2014
On Tuesday, 4 February 2014 at 00:08:28 UTC, Frank Bauer wrote:
> How cool would that be: a language with the memory allocation features of Rust but with stronger syntax and semantics roots in the C / C++ world?

It is really tempting to write a new parser for Rust, isn't it? :-)

February 04, 2014
On Mon, 03 Feb 2014 15:26:22 -0800, Frustrated <c1514843@drdrb.com> wrote:

> On Monday, 3 February 2014 at 21:42:59 UTC, Shammah Chancellor
> wrote:
>>
>> You can always force the GC to run between cycles in your game, and
>> turn off automatic sweeps.  This is how most games operate nowadays.
>> It's also probably possible to create a drop-in replacement for the GC
>> to do something else.  I could see it being *VERY* useful to make the
>> GC take a compile-time parameter to select which GC engine is used.
>
>
> This is just nonsense. Maybe this is why modern games suck then?
> How do you guarantee that the GC won't kick in at the most
> inopportune times? Oh, you manually trigger it? When? Right at
> the moment when the player is about to down the boss after a
> 45-minute fight?
>
> Oh, right... you just segfault because there is no memory left.
>
> On Monday, 3 February 2014 at 22:51:50 UTC, Frank Bauer wrote:
>
>> I'm not quite sure that I understand what you mean by GC avoidance being a major focus of 2014 though. In the long term, can I look forward to writing an absolutely, like in 100 %, like in guaranteed, GC free D app with all of current D's and Phobos' features if I choose to? Or does it mean: well for the most part it will avoid the GC, but if you're unlucky the GC might still kick in if you don't pay attention and when you least expect it?
>
> It either has to be 100% or nothing. The only issue with the GC
> is the non-determinism... or, if you do corner it and trigger it
> manually, you end up with exactly the types of problems Mr.
> Chancellor thinks don't exist... i.e., the longer you have to
> put off the GC, the worse it becomes (the more time it takes to run
> or the less memory you have to work with).
>

Why is this myth of non-determinism still alive? The only truly non-deterministic GCs are concurrent collectors, but alas concurrent collectors don't routinely stop the world either, so there really aren't any pauses to complain about. D's mark-sweep GC is *perfectly* deterministic. It can *only* pause on allocation. Ergo you can determine exactly which allocation caused the problem. You might not expect the function you called to GC-allocate, but that doesn't make it non-deterministic, just not what you expected. Please stop blaming your missed expectations on the GC. This non-determinism thing is a red herring that is repeated over and over by people who obviously have no idea what they are talking about.
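
For what it's worth, that property is easy to observe. A minimal sketch (allocation size and threshold picked arbitrarily) that flags which allocation a collection happened inside of:

import std.datetime.stopwatch : StopWatch, AutoStart;
import std.stdio : writeln;

void main()
{
    auto sw = StopWatch(AutoStart.no);
    foreach (i; 0 .. 1_000_000)
    {
        sw.reset();
        sw.start();
        auto buf = new ubyte[](1024);      // the only place a pause can start
        sw.stop();

        // An unusually slow allocation means a collection ran inside it.
        if (sw.peek.total!"msecs" > 1)
            writeln("collection during allocation #", i);
    }
}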

> It might work OK with some concurrent AGC that you can use for
> non-critical parts. E.g., have Phobos use the GC for non-real-time
> sections of the app (boot-up, menus for games, etc...), then
> disable it and use ARC when the app is meant for optimal
> and/or deterministic performance.
>
> One could argue that if one goes this route, why not use ARC or
> manual memory management (MMM?) the whole time... but by using the
> GC during the non-critical parts of the program, one can focus
> less on memory leaks, as usual with the GC.
>
> What would be nice is to be able to write code that is oblivious
> to how its memory is managed, but be able to switch between
> different methods.
>
> [Startup -> menus and other non-performance sections of the game] <- use GC
> [real-time areas of game] <- use ARC or manual
>


-- 
Adam Wilson
GitHub/IRC: LightBender
Aurora Project Coordinator
February 04, 2014
On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:
>  ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb.  I bet they never tried to use a GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It only runs when I allocate, oh what a fool I've been, please castigate me harder!

Also, people should consider that Apple (unlike C++ game devs) did not have a tradition of contempt for GC. In fact, they tried GC *before* they switched to ARC. The pro-GC camp always likes to pretend that the anti-GC one is just ignorant, rejecting GC based on prejudice rather than experience, but Apple rejected GC based on experience.

GCed Objective-C did not allow them to deliver the user experience they wanted (on mobile) because of the related latency issues, so they switched to automatic ref counting. It is not in question that ref counting sacrifices throughput (compared to an advanced GC), but for interactive, user-facing applications latency is much more important.

You can do soft real-time with GC as long as the GC is incremental (D's is not) and you heavily rely on object reuse. That is what I am doing with LuaJIT right now, and the frame rates are nice and constant indeed. However, you pay a high price for that. Object reuse means writing additional code and makes things more complex and error-prone, which is why your average app developer does not do it... and should not have to do it.
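
A minimal sketch of what that object reuse typically looks like (a hypothetical Bullet type and a trivial fixed-size free list; real pools are more involved):

class Bullet
{
    float x = 0, y = 0, vx = 0, vy = 0;
    bool alive;
}

struct BulletPool
{
    Bullet[256] slots;   // fixed-size free list, no allocation on the hot path
    size_t count;

    Bullet acquire()
    {
        if (count)
        {
            auto b = slots[--count];   // reuse a dead object instead of allocating
            b.alive = true;
            return b;
        }
        return new Bullet;             // GC allocation only as a fallback (warm-up)
    }

    void release(Bullet b)
    {
        b.alive = false;
        if (count < slots.length)
            slots[count++] = b;        // back onto the free list
    }
}

void main()
{
    BulletPool pool;
    auto b = pool.acquire();
    pool.release(b);
    assert(pool.acquire() is b);       // the same object comes back, nothing new allocated
}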

Apple had to come up with a solution which does not assume that developers will be careful about allocations. The performance of the apps in the iOS App Store is ultimately part of the user experience, so ARC is the right solution because it means that your average iOS app written by Joe Coder will not have latency issues, or at least fewer latency issues than any GC-based solution.

I think it is an interesting decision for the D development team to make. Do you want a language which can achieve low latency *if used carefully*, or one which sacrifices maximal throughput for fewer latency issues in the common case?

I see no obvious answer to that. I have read that D has recently been used for some server system at Facebook; ref counting usually degrades performance in that area. It is no coincidence that Java shines on the server as a high-performance solution, while Java is a synonym for a dog-slow memory hog on the desktop and is mighty unpopular there because of that. The whole Java ecosystem, from the VM to the libraries, is optimized for enterprise server use cases (throughput, scalability, and robustness), not for making responsive GUIs (and low latency in general) or for memory use.

If D wants to be the new Java, GC is the way to go, but no heap-allocation-happy GCed language will ever challenge C/C++ on the desktop.

Which reminds me of another major company that backpedaled on GC based on experience: Microsoft. Do you remember the talk back when .NET/C# were new? Microsoft totally wanted that to be the technology stack of the future, "managed code" everywhere, with C/C++ becoming "legacy". However, C# ended up being nothing more than Microsoft Java, shoveling enterprise CRUD in the server room. Microsoft is hosting "Going Native" conferences nowadays, declaring their present and future dedication to C++ (again), and they based the new WinRT on ref counting, not GC.