October 09, 2013
Am 09.10.2013 07:23, schrieb PauloPinto:
> Apple dropped the GC and went ARC instead, because they never
> managed to make it work properly.
>
> It was full of corner cases, and the application could crash if
> those cases were not fully taken care of.
>
> Of course the PR message is "We dropped GC because ARC is better"
> and not "We dropped GC because we failed".
>
> Now having said this, of course D needs a better GC as the
> current one doesn't fulfill the needs of potential users of the
> language.

the question is: could ARC be an option for automatic memory management in D, so that the compiler generates ARC code when the GC is not in use, but GC code where it is needed?

or is that a hard goal to reach, due to problems combining GC-using and ARC-using libraries?
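
to illustrate the first question - a purely hypothetical lowering (__rc_inc/__rc_dec and Widget are made-up names, nothing in druntime), where the compiler would turn reference assignments into pairs of count operations:

    class Widget { void draw() {} }

    void example(Widget w)
    {
        Widget local = w;     // compiler would insert: __rc_inc(local);
        local.draw();
    }                         // scope end: compiler would insert: __rc_dec(local);

an optimiser that understands those calls could then drop the pair entirely, since the reference never escapes the function.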
October 09, 2013
On Wednesday, 9 October 2013 at 03:39:38 UTC, Andrei Alexandrescu wrote:
> On 10/8/13 4:45 PM, Jonathan M Davis wrote:
>> On Wednesday, October 09, 2013 01:04:39 Tourist wrote:
>>> I thought about an alternative approach:
>>> Instead of using a (yet another) annotation, how about
>>> introducing a flag similar to -cov, which would output lines in
>>> which the GC is used.
>>> This information can be used by an IDE to highlight those lines.
>>> Then you could quickly navigate through your performance-critical
>>> loop and make sure it's clean of GC.
>>
>> That sounds like a much less invasive approach than a @nogc attribute.
>
> Problem is with functions that have no source available.
>
> Andrei


Mangle the @nogc into the name?
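
Something along these lines, assuming a hypothetical @nogc attribute that gets encoded in the mangled symbol, so even a declaration without source can be checked at the call site:

    // declaration only - the body lives in a library we have no source for,
    // but the @nogc guarantee travels with the mangled name
    @nogc int parse(const(char)[] s);

    @nogc void caller()
    {
        auto n = parse("123");   // fine: the callee is known @nogc from its symbol
        // a ~ b on runtime strings here would be rejected: hidden GC allocation
    }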
October 09, 2013
On Tuesday, 8 October 2013 at 23:05:37 UTC, Walter Bright wrote:
> On 10/8/2013 12:34 PM, Jonathan M Davis wrote:
>> I think that it's clear that for some projects, it's critical to minimize the
>> GC, and I think that it's clear that we need to do a better job of supporting
>> the folks who want to minimize GC usage, but I also think that for the vast
>> majority of cases, complaints about the GC are way overblown. It becomes an
>> issue when you're doing a lot of heap allocations, but it's frequently easy to
>> design D code so that heap allocations are relatively rare such that they
>> aren't going to be a serious problem outside of code which is performance
>> critical to the point that it would be worrying about the cost of malloc
>> (which most code isn't). Personally, the only time that I've run into issues
>> with the GC is when trying to do use RedBlackTree with a lot of items. That
>> has a tendency to tank performance.
>>
>> So, yes, it's a problem. Yes, we need to improve the situation. But for most
>> situations, I think that the concern about the GC is way overblown.
>
> +1
>
> Some years ago, a colleague of mine moonlighted teaching remedial algebra at the UW. She'd write on the board:
>
>    x + 2 = 5
>
> and call on a student to "solve for x". The student would collapse in a stuttering heap of jelly, emitting sparks and smoke like a Star Trek computer. She discovered that if she wrote instead:
>
>    _ + 2 = 5
>
> and would ask the same student what goes in the blank spot, he'd say "3" without hesitation.
>
> In other words, the student would only see the words "solve", "x", and "algebra" which were a shortcut in his brain to "I can't do this" and "gee math is hard." She found she was a far more effective teacher by avoiding using those words.
>
> I realized the same thing was happening with the word "template". I talked Andrei into avoiding all use of that word in "The D Programming Language", figuring that we could get people who were terrified of "templates" to use them successfully without realizing it (and I think this was very successful).
>
> We have a similar problem with "GC". People hear that word, and they are instantly turned off. No amount of education will change that. We simply have to find a better way to deal with this issue.

You remind me of a well-known Chinese fable.

At the time when Fan, a nobleman of the state of Jin, became a fugitive, a commoner found a bell and wanted to carry it off on his back. But the bell was too big for him. When he tried to knock it into pieces with a hammer there was a loud clanging sound. He was afraid that someone would hear the noise and take the bell from him, so he immediately stopped his own ears.
To worry about other people hearing the noise is understandable, but to worry about himself hearing the noise (as if stopping his own ears would prevent other people from hearing) is absurd.
October 09, 2013
On 9 October 2013 15:23, PauloPinto <pjmlp@progtools.org> wrote:

> On Wednesday, 9 October 2013 at 05:15:53 UTC, Manu wrote:
>
>> On 9 October 2013 08:58, ponce <contact@gmsfrommars.fr> wrote:
>>
>>  On Tuesday, 8 October 2013 at 22:45:51 UTC, Adam D. Ruppe wrote:
>>>
>>>
>>>> Eh, not necessarily. If it expands to static assert(!__traits(hasAnnotationRecursive, uses_gc));, then the only ones that *need* to be marked are the lowest level ones. Then it figures out the rest only on demand.
>>>>
>>>> Then, on the function you care about as a user, you say nogc and it
>>>> tells
>>>> you if you called anything and the static assert stacktrace tells you
>>>> where
>>>> it happened.
>>>>
>>>> Of course, to be convenient to use, phobos would need to offer
>>>> non-allocating functions, which is indeed a fair amount of work, but
>>>> they
>>>> wouldn't *necessarily* have to have the specific attribute.
>>>>
>>>>
>>> But is it even necessary? There isn't a great deal of evidence that someone interested in optimization will be blocked on this particular problem, like Peter Alexander said.
>>>
>>> GC hassle is quite common but not that big a deal:
>>> - Manu: "Consequently, I avoid the GC in D too, and never had any major
>>> problems, only inconvenience." http://www.reddit.com/r/programming/comments/1nxs2i/the_state_of_rust_08/ccnefe7
>>>
>>> - Dav1d: said he never had a GC problem with BRala (minecraft client)
>>> - Me: I had a small ~100ms GC pause in one of my games every 20 minutes,
>>> more often than not I don't notice it
>>>
>>> So a definitive written rebuttal we can link to would perhaps be helpful.
>>>
>>>
>> I might just add, that while my experience has been that I haven't had any
>> significant technical problems when actively avoiding the GC, the
>> inconvenience is considerably more severe than I made out in that post (I
>> don't want to foster public negativity).
>> But it is actually really, really inconvenient. If that's my future with
>> D,
>> then I'll pass, just as any unbiased 3rd party would.
>>
>> I've been simmering on this issue ever since I took an interest in D. At
>> first I was apprehensive to accept the GC, then cautiously optimistic that
>> the GC might be okay. But I have seen exactly no movement in this area as
>> long as I've been following D, and I have since reverted to a position in
>> absolute agreement with the C++ users. I will never accept the GC in its
>> current form for all of my occupational requirements; it's implicitly
>> non-deterministic, and offers very little control over performance
>> characteristics.
>> I've said before that until I can time-slice the GC, and it does not stop
>> the world, then it doesn't satisfy my requirements. I see absolutely no
>> motion towards that goal.
>> If I were one of those many C++ users evaluating D for long-term adoption
>> (and I am!), I'm not going to invest the future of my career and industry
>> in a complete question mark which given years of watching already, is
>> clearly going nowhere.
>> As far as the GC is concerned, with respect to realtime embedded software,
>> I'm out. I've completely lost faith. And it's going to take an awful lot
>> more to restore my faith again than before.
>>
>> What I want is an option to replace the GC with ARC, just like Apple did.
>> Clearly they came to the same conclusion, probably for exactly the same
>> reasons.
>> Apple have a policy of silky smooth responsiveness throughout the OS and
>> the entire user experience. They consider this a sign of quality and
>> professionalism.
>> As far as I can tell, they concluded that non-deterministic GC pauses were
>> incompatible with their goal. I agree.
>> I think their experience should be taken very seriously. They have a
>> successful platform on weak embedded hardware, with about a million
>> applications deployed.
>>
>> I've had a lot of conversations with a lot of experts, plenty of
>> conversations at dconf, and nobody could even offer me a vision for a GC
>> that is acceptable.
>> As far as I can tell, nobody I talked to really thinks a GC that doesn't
>> stop the world, which can be carefully scheduled/time-sliced (ie, an
>> incremental, thread-local GC, or whatever), is even possible.
>>
>> I'll take ARC instead. It's predictable, easy for all programmers who aren't experts on resource management to understand, and I have DIRECT control over its behaviour and timing.
>>
>> But that's not enough, offering convenience while trying to avoid using
>> the
>> GC altogether is also very important. You should be able to write software
>> that doesn't allocate memory. It's quite hard to do in D today. There's
>> plenty of opportunity for improvement.
>>
>> I'm still keenly awaiting a more-developed presentation of Andrei's allocators system.
>>
>
>
>
> Apple dropped the GC and went ARC instead, because they never managed to make it work properly.
>
> It was full of corner cases, and the application could crash if those cases were not fully taken care of.
>
> Of course the PR message is "We dropped GC because ARC is better" and not "We dropped GC because we failed".
>
> Now having said this, of course D needs a better GC as the current one doesn't fulfill the needs of potential users of the language.
>

Well, I never read that article apparently... but that's possibly even more
of a concern if true.
Does anyone here REALLY believe that a bunch of volunteer contributors can
possibly do what Apple failed to do with their squillions of dollars and
engineers?
I haven't heard anybody around here propose the path to an acceptable
solution. It's perpetually in the too-hard basket, hence we still have the
same GC as forever and it's going nowhere.


October 09, 2013
GC works for some cases, but the global one-size-fits-all GC that D uses is no good.



October 09, 2013
On Wednesday, 9 October 2013 at 06:05:52 UTC, dennis luehring wrote:
> Am 09.10.2013 07:23, schrieb PauloPinto:
>> Apple dropped the GC and went ARC instead, because they never
>> managed to make it work properly.
>>
>> It was full of corner cases, and the application could crash if
>> those cases were not fully taken care of.
>>
>> Of course the PR message is "We dropped GC because ARC is better"
>> and not "We dropped GC because we failed".
>>
>> Now having said this, of course D needs a better GC as the
>> current one doesn't fulfill the needs of potential users of the
>> language.
>
> the question is: could ARC be an option for automatic memory management in D, so that the compiler generates ARC code when the GC is not in use, but GC code where it is needed?
>
> or is that a hard goal to reach, due to problems combining GC-using and ARC-using libraries?

Personally I think ARC can only work properly if it is handled by the compiler, even if D is powerful enough to express it as a library type.

ARC is too costly if it has to increment/decrement counters on every pointer access, both in time and in cache misses.

Objective-C, Rust and ParaSail do it well, because ARC is built into the compiler, which can elide needless operations.

Library solutions like C++ and D suffer without compiler support.

Personally, I think it could be two step:

- Improve the GC, because it is what most developers will care about anyway

- Make the D compilers aware of RefCounted and friends, to minimize memory accesses, for the developers that care about every ms they can extract from the hardware (see the sketch below).
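
To illustrate the kind of elision I have in mind, take std.typecons.RefCounted (just a rough sketch; Data, consume and process are placeholder names):

    import std.typecons : RefCounted;

    struct Data { int[] payload; }

    void consume(RefCounted!Data d) {}   // by-value copy: ++count on entry, --count on return

    void process(RefCounted!Data d)
    {
        auto tmp = d;      // postblit: ++count
        consume(tmp);      // another ++count / --count pair
    }                      // scope end: --count for tmp

A library type has to perform every one of those counter updates; a compiler that knows the semantics of RefCounted could prove the count never reaches zero inside process and remove them all.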


--
Paulo
October 09, 2013
On 10/9/2013 12:29 AM, Manu wrote:
> Does anyone here REALLY believe that a bunch of volunteer contributors can
> possibly do what Apple failed to do with their squillions of dollars and engineers?
> I haven't heard anybody around here propose the path to an acceptable solution.
> It's perpetually in the too-hard basket, hence we still have the same GC as
> forever and it's going nowhere.

What do you propose?
October 09, 2013
On 9 October 2013 16:05, dennis luehring <dl.soluz@gmx.net> wrote:

> Am 09.10.2013 07:23, schrieb PauloPinto:
>
>  Apple dropped the GC and went ARC instead, because they never
>> managed to make it work properly.
>>
>> It was full of corner cases, and the application could crash if those cases were not fully taken care of.
>>
>> Of course the PR message is "We dropped GC because ARC is better" and not "We dropped GC because we failed".
>>
>> Now having said this, of course D needs a better GC as the current one doesn't fulfill the needs of potential users of the language.
>>
>
> the question is: could ARC be an option for automatic memory management in D, so that the compiler generates ARC code when the GC is not in use, but GC code where it is needed?
>
> or is that a hard goal to reach, due to problems combining GC-using and ARC-using libraries?
>

It sounds pretty easy to reach to me. Having the compiler generate inc/dec ref calls can't possibly be difficult. An optimisation that simplifies redundant inc/dec sequences doesn't sound hard either... :/
Is there more to it? Cleaning up circular references I guess... what does
Apple do?
It's an uncommon edge case, so there's gotta be heaps of room for efficient
solutions to that (afaik) one edge case. Are there others?
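
From what I understand, Apple's answer is to have the programmer break cycles by hand with weak references, rather than detect them at runtime. In D terms that would look something like this (Weak!T is purely illustrative, not an existing type):

    class Node
    {
        Node   parent;      // strong back-pointer: parent <-> child keep each
                            // other alive forever under pure reference counting
        Node[] children;
    }

    // the ARC-friendly version marks the back-pointer as weak, so it doesn't
    // contribute to the count and the cycle can be reclaimed:
    //     Weak!Node parent;

So the cost of that edge case gets pushed onto the programmer, not the runtime.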


October 09, 2013
On Tuesday, 8 October 2013 at 17:47:54 UTC, Brad Anderson wrote:
> On Tuesday, 8 October 2013 at 16:29:38 UTC, ponce wrote:
>> On Tuesday, 8 October 2013 at 16:22:25 UTC, Dicebot wrote:
>>> It is not overblown. It is simply "@nogc" which is lacking but absolutely mandatory. Amount of hidden language allocations makes manually cleaning code of those via runtime asserts completely unreasonable for real project.
>>
>> Hidden language allocations:
>> - concatenation operator   ~
>> - homogeneous arguments   void (T[]... args)
>> - "real" closures that escapes
>> - array literals
>> - some phobos calls
>>
>> What else am I missing?
>> I don't see the big problem, and a small fraction of projects
>> will require a complete ban on GC allocation, right?
>
> Johannes Pfau's -vgc pull request[1] had a list of ones he was able to find. It's all allocations, not just hidden allocations:
>
> COV         // Code coverage enabled
> NEW         // User called new (and it's not placement new)
> ASSERT_USER // A call to assert. This usually throws, but can be overwritten
>             // by user
> SWITCH_USER // Called on switch error. This usually throws, but can be
>             // overwritten by user
> HIDDEN_USER // Called on hidden function error. This usually throws, but can
>             // be overwritten by user
> CONCAT      // a ~ b
> ARRAY       // array.length = value, literal, .dup, .idup, .sort
> APPEND      // a~= b
> AALITERAL   // ["a":1]
> CLOSURE
>
> 1. https://github.com/D-Programming-Language/dmd/pull/1886

The closure one is a problem. I think that returning a closure should use a different syntax from using a normal delegate. I doubt it's something you _ever_ want to do by accident.

It's a problem because you can't see at a glance if a function uses a closure or not. You have to inspect the entire function very carefully, checking all
code paths.
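
To make it concrete, this is the sort of thing I mean - nothing in the signature warns you, but returning the delegate forces the local into a GC-allocated closure:

    int delegate(int) makeAdder(int base)
    {
        return (int x) => x + base;   // 'base' escapes: hidden GC allocation here
    }

    void caller()
    {
        auto add3 = makeAdder(3);
        assert(add3(4) == 7);
    }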





October 09, 2013
On Wednesday, 9 October 2013 at 07:29:30 UTC, Manu wrote:
> On 9 October 2013 15:23, PauloPinto <pjmlp@progtools.org> wrote:
>
>> On Wednesday, 9 October 2013 at 05:15:53 UTC, Manu wrote:
>>
>>> On 9 October 2013 08:58, ponce <contact@gmsfrommars.fr> wrote:
>>>
>>>  On Tuesday, 8 October 2013 at 22:45:51 UTC, Adam D. Ruppe wrote:
>>>>
>>>>
>>>>> Eh, not necessarily. If it expands to static assert(!__traits(hasAnnotationRecursive, uses_gc));, then the only ones that *need* to be marked are the lowest level ones. Then it figures out the rest only on demand.
>>>>>
>>>>> Then, on the function you care about as a user, you say nogc and it
>>>>> tells
>>>>> you if you called anything and the static assert stacktrace tells you
>>>>> where
>>>>> it happened.
>>>>>
>>>>> Of course, to be convenient to use, phobos would need to offer
>>>>> non-allocating functions, which is indeed a fair amount of work, but
>>>>> they
>>>>> wouldn't *necessarily* have to have the specific attribute.
>>>>>
>>>>>
>>>> But is it even necessary? There isn't a great deal of evidence that
>>>> someone interested in optimization will be blocked on this particular
>>>> problem, like Peter Alexander said.
>>>>
>>>> GC hassle is quite common but not that big a deal:
>>>> - Manu: "Consequently, I avoid the GC in D too, and never had any major
>>>> problems, only inconvenience." http://www.reddit.com/r/programming/comments/1nxs2i/the_state_of_rust_08/ccnefe7
>>>>
>>>> - Dav1d: said he never had a GC problem with BRala (minecraft client)
>>>> - Me: I had a small ~100ms GC pause in one of my games every 20 minutes,
>>>> more often than not I don't notice it
>>>>
>>>> So a definitive written rebuttal we can link to would perhaps be helpful.
>>>>
>>>>
>>> I might just add, that while my experience has been that I haven't had any
>>> significant technical problems when actively avoiding the GC, the
>>> inconvenience is considerably more severe than I made out in that post (I
>>> don't want to foster public negativity).
>>> But it is actually really, really inconvenient. If that's my future with
>>> D,
>>> then I'll pass, just as any unbiased 3rd party would.
>>>
>>> I've been simmering on this issue ever since I took an interest in D. At
>>> first I was apprehensive to accept the GC, then cautiously optimistic that
>>> the GC might be okay. But I have seen exactly no movement in this area as
>>> long as I've been following D, and I have since reverted to a position in
>>> absolute agreement with the C++ users. I will never accept the GC in its
>>> current form for all of my occupational requirements; it's implicitly
>>> non-deterministic, and offers very little control over performance
>>> characteristics.
>>> I've said before that until I can time-slice the GC, and it does not stop
>>> the world, then it doesn't satisfy my requirements. I see absolutely no
>>> motion towards that goal.
>>> If I were one of those many C++ users evaluating D for long-term adoption
>>> (and I am!), I'm not going to invest the future of my career and industry
>>> in a complete question mark which given years of watching already, is
>>> clearly going nowhere.
>>> As far as the GC is concerned, with respect to realtime embedded software,
>>> I'm out. I've completely lost faith. And it's going to take an awful lot
>>> more to restore my faith again than before.
>>>
>>> What I want is an option to replace the GC with ARC, just like Apple did.
>>> Clearly they came to the same conclusion, probably for exactly the same
>>> reasons.
>>> Apple have a policy of silky smooth responsiveness throughout the OS and
>>> the entire user experience. They consider this a sign of quality and
>>> professionalism.
>>> As far as I can tell, they concluded that non-deterministic GC pauses were
>>> incompatible with their goal. I agree.
>>> I think their experience should be taken very seriously. They have a
>>> successful platform on weak embedded hardware, with about a million
>>> applications deployed.
>>>
>>> I've had a lot of conversations with a lot of experts, plenty of
>>> conversations at dconf, and nobody could even offer me a vision for a GC
>>> that is acceptable.
>>> As far as I can tell, nobody I talked to really thinks a GC that doesn't
>>> stop the world, which can be carefully scheduled/time-sliced (ie, an
>>> incremental, thread-local GC, or whatever), is even possible.
>>>
>>> I'll take ARC instead. It's predictable, easy for all programmers who
>>> aren't experts on resource management to understand, and I have DIRECT
>>> control over its behaviour and timing.
>>>
>>> But that's not enough, offering convenience while trying to avoid using
>>> the
>>> GC altogether is also very important. You should be able to write software
>>> that doesn't allocate memory. It's quite hard to do in D today. There's
>>> plenty of opportunity for improvement.
>>>
>>> I'm still keenly awaiting a more-developed presentation of Andrei's
>>> allocators system.
>>>
>>
>>
>>
>> Apple dropped the GC and went ARC instead, because they never managed to
>> make it work properly.
>>
>> It was full of corner cases, and the application could crash if those
>> cases were not fully taken care of.
>>
>> Of course the PR message is "We dropped GC because ARC is better" and not
>> "We dropped GC because we failed".
>>
>> Now having said this, of course D needs a better GC as the current one
>> doesn't fulfill the needs of potential users of the language.
>>
>
> Well, I never read that article apparently... but that's possibly even more
> of a concern if true.
> Does anyone here REALLY believe that a bunch of volunteer contributors can
> possibly do what Apple failed to do with their squillions of dollars and
> engineers?
> I haven't heard anybody around here propose the path to an acceptable
> solution. It's perpetually in the too-hard basket, hence we still have the
> same GC as forever and it's going nowhere.

I already provided that information in another discussion thread a while ago:

http://forum.dlang.org/post/cntjtnvnrwgdoklvznnw@forum.dlang.org

It is easy for developers outside the Objective-C world to believe the ARC PR, without knowing what happened on the battlefield. :)

--
Paulo