February 04, 2014
On 04.02.2014 01:24, NoUseForAName wrote:
> On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:
>>  ur right I never thought of that, I bet all them game devs never
>> thought of it either, they so dumb.  I bet they never tried to use a
>> GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It
>> only runs when I allocate, oh what a fool I've been, please castigate
>> me harder!
>
> Also people should consider that Apple (unlike C++ game devs) did not
> have a tradition of contempt for GC. In fact they tried GC *before* they
> switched to ARC. The pro-GC camp always likes to pretend that the
> anti-GC one is just ignorant, rejecting GC based on prejudice, not
> experience, but Apple rejected GC based on experience.
>
> GCed Objective-C did not allow them to deliver the user experience they
> wanted (on mobile), because of the related latency issues. So they
> switched to automated ref counting. It is not in question that ref
> counting sacrifices throughput (compared to an advanced GC), but for
> interactive, user-facing applications latency is much more important.

The reality for those hanging around Apple developer forums before ARC existed was more related to crashes caused by Apple's GC implementation and its interaction with third-party libraries, especially the number of special cases one needed to take care of.

They *failed* to produce a stable GC implementation for Objective-C; that is the reason, not better performance.

--
Paulo
February 04, 2014
> By the way, while this statement was true for the initial design, they have recently moved to a much simpler model, replacing most of the more complicated pointer types with library solutions. I think those who refer to the Rust example are more likely to have that new model in mind, and your judgement seems to be based on the previous one.
In my opinion the new model is even harder. Since you can only put immutable data into their Gc/Rc structures, you end up with things like Rc<RefCell<Type>> or Rc<Cell<Type>> and must understand exactly what each of those does. For me even the library solution in C++ (shared_ptr<Type>) is somewhat annoying to type compared to the current D solution or other managed languages. Dereferencing these types in Rust is also very hard at the moment, but they plan to make it easier.

Freely mixing ARC and GC is also not that easy, in my opinion: as soon as you have a garbage-collected object anywhere in your object hierarchy, basically anything beneath it will also be "garbage collected" - in the sense of only being deallocated when the GC runs. And if you used RAII semantics in some of your ARC-managed objects, you might wonder why their resources are not released immediately, but only when the GC runs.
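To make the effect concrete, here is a minimal D sketch (Resource and Owner are hypothetical types); the same reasoning applies to an ARC object reachable from a GC-managed one:

import std.stdio : writeln;

struct Resource
{
    ~this() { writeln("resource released"); }
}

class Owner // GC-managed
{
    ~this() { writeln("owner finalized"); } // runs only when the GC collects
}

void main()
{
    {
        Resource r;
    } // deterministic: "resource released" prints right here, at scope exit

    auto o = new Owner();
    o = null; // non-deterministic: Owner's destructor runs at some later
              // collection (often not until program shutdown), not here
}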
February 04, 2014
On 2/4/14, 1:59 AM, Don wrote:
> We're using D as a systems language on a global commercial scale. And
> no, we don't use malloc/free. We just don't use Phobos.

What do you use instead of malloc/free?

Andrei
February 04, 2014
On 2/4/14, 6:26 AM, Frank Bauer wrote:
> On Tuesday, 4 February 2014 at 06:47:03 UTC, Walter Bright wrote:
>> On 2/3/2014 7:03 PM, Adam Wilson wrote:
>>> Note that ObjC has special syntax to handle weak pointers. It's not well
>>> understood by many.
>>
>> Sounds like explicitly managed memory is hardly worse.
>>
>> On 2/3/2014 3:13 PM, woh wrote:
>>>
>>>  Any system that forces a single way of handling memory as the only
>>> viable
>>> method, be it GC (as D currently does)
>>
>> This is incorrect. You can use malloc/free in D, as well as write &
>> use your own allocators, and even write your own ref counted types.
>>
>> On 2/3/2014 1:42 PM, Shammah Chancellor wrote:
>>> It's also probably
>>> possible to create a drop-in replacement for the GC to do something
>>> else.
>>
>> It certainly is possible. There's nothing magic about the current GC,
>> it's just library code.
>
>
> Andrei Alexandrescu wrote:
>> 2. Work on Phobos to see what can be done about avoiding unnecessary
>> allocation. Most likely we'll need to also add a @nogc flag.
>> ...
>> 4. Work on the core language and druntime to see how to seamlessly
>> accommodate alternate GC mechanisms such as reference counting.
>> ...
>> I thought I made it clear that GC avoidance (which includes
>> considering built-in reference counting) is a major focus of 2014.
>>
>> Andrei
>
> I'm totally confused: Walter, do you back what Andrei says or do we have
> a good cop / bad cop situation here?

There's no contradiction. Walter and I are on the same page.
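For reference, Walter's point about manual management holds today; a minimal sketch (Point is just an illustrative type):

import core.stdc.stdlib : malloc, free;

struct Point { double x, y; }

void main()
{
    // C-style allocation: the GC is not involved at all.
    auto p = cast(Point*) malloc(Point.sizeof);
    scope(exit) free(p);
    p.x = 1.0;
    p.y = 2.0;

    // If a manually allocated block stores pointers into GC memory,
    // register it with GC.addRange (and GC.removeRange before freeing).
}

Phobos also already ships a simple ref-counted wrapper, std.typecons.RefCounted, as a starting point for custom schemes.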

Andrei

February 04, 2014
On 2/4/14, 6:45 AM, Dicebot wrote:
> On Tuesday, 4 February 2014 at 14:22:41 UTC, Paulo Pinto wrote:
> Rust has yet to prove itself on the mainstream market.
>>
>> While D already has commercial users.
>
> It is a dangerous position. D is not far enough ahead to appeal to its
> own authority.

I agree.

Andrei

February 04, 2014
On Mon, 03 Feb 2014 15:52:53 -0500, Adam Wilson <flyboynw@gmail.com> wrote:

> That said, I firmly believe that wholesale replacement of the GC is throwing the baby out with the bathwater. Effectively, the poor D GC implementation has become an excuse to launch a crusade against all GCs everywhere, never mind that the Java and .NET GCs are consistent examples of just how good GCs can actually be.

AFAIK, Java's and .NET's GCs require precise collection. This is not possible in D (at least not fully). While I think we need to move more towards precise collection, we will never be as good as languages which have that restriction. Some GC options just aren't available.
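A minimal sketch of the underlying problem, using nothing beyond what the language allows today: D code can store a pointer and an integer in the same memory, so the collector cannot always classify a word and must scan it conservatively.

union Ambiguous
{
    size_t number;  // might be an integer...
    void*  pointer; // ...or a pointer - the GC cannot tell at runtime
}

void main()
{
    Ambiguous a;
    a.number = 0x0040_0000; // if this bit pattern happens to equal a live
                            // heap address, a conservative scan must treat it
                            // as a pointer and keep that block alive
}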

I think ARC is good for many (most?) applications, and GC is good for many as well. We should have the option of both.

I still look forward to the day when I can write iPhone apps in D instead of Objective-C :)

-Steve
February 04, 2014
On Tuesday, 4 February 2014 at 09:59:07 UTC, Don wrote:
> We're using D as a systems language on a global commercial scale. And no, we don't use malloc/free. We just don't use Phobos.

With all respect, I don't consider Sociomantic projects examples of systems-level programming (now that I have had a chance to inspect them closely ;)). "Soft real-time, performance-critical system" is a better description.
February 04, 2014
On Mon, 03 Feb 2014 19:49:04 -0500, Adam Wilson <flyboynw@gmail.com> wrote:

> On Mon, 03 Feb 2014 16:24:52 -0800, NoUseForAName <no@spam.com> wrote:
>
>> On Monday, 3 February 2014 at 23:00:23 UTC, woh wrote:
>>>  ur right I never thought of that, I bet all them game devs never thought of it either, they so dumb.  I bet they never tried to use a GC, what fools!  Endless graphs of traced objects, oh yes oh yes!  It only runs when I allocate, oh what a fool I've been, please castigate me harder!
>>
>> Also people should consider that Apple (unlike C++ game devs) did not have a tradition of contempt for GC. In fact they tried GC *before* they switched to ARC. The pro-GC camp always likes to pretend that the anti-GC one is just ignorant, rejecting GC based on prejudice, not experience, but Apple rejected GC based on experience.
>>
>> GCed Objective-C did not allow them to deliver the user experience they wanted (on mobile), because of the related latency issues. So they switched to automated ref counting. It is not in question that ref counting sacrifices throughput (compared to an advanced GC), but for interactive, user-facing applications latency is much more important.
>>
>
> That may be the case, but StackOverflow shows that ARC hasn't been a panacea in Apple land either. Way too many people don't understand ARC and how to use it, and subsequently beg for help understanding heisenleaks and weak references. ARC places a higher cognitive load on the programmer than a GC does. And Android runs just fine with GC'ed apps, but the ARC guys don't want to talk about Google's successes there.

Where you have to be cognizant is avoiding cycles. Plain and simple. And it's not that difficult. The compiler takes care of the rest. It's somewhat magical I suppose :) Managing your autorelease pools can make a difference, but is not necessarily critical.
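For the record, here is what the one real trap looks like - a contrived D sketch with a hand-rolled reference count (acquire/release are illustrative helpers, not a real ARC implementation):

import core.stdc.stdlib : malloc, free;

struct Node
{
    int   refs = 1;
    Node* other; // strong reference to the peer
}

Node* acquire(Node* n) { if (n) ++n.refs; return n; }

void release(Node* n)
{
    if (n && --n.refs == 0)
    {
        release(n.other); // drop our strong reference first
        free(n);
    }
}

void main()
{
    auto a = cast(Node*) malloc(Node.sizeof); *a = Node.init;
    auto b = cast(Node*) malloc(Node.sizeof); *b = Node.init;
    a.other = acquire(b);
    b.other = acquire(a); // cycle: each keeps the other's count above zero
    release(a);
    release(b); // both blocks leak; making one direction a weak
                // (non-counted) pointer is the standard fix
}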

I think when working on embedded systems, it is quite important to understand the limitations and strengths of the language/hardware you are using, much more so than on a full PC/server. Where ARC really beats the GC is on memory usage: it's very, very close to malloc/free usage.

I would love to see D gain an ARC system, especially if it can link seamlessly with Objective-C. What I don't know is whether one can replace a GC with an ARC memory manager without having to port any high-level code. I don't think that is the case. Which leads me to think -- is it even possible to write druntime/phobos in an agnostic way? If it isn't, what subset can be done that way?

-Steve
February 04, 2014
On Tuesday, 4 February 2014 at 18:19:56 UTC, Matthias Einwag wrote:
> In my opinion the new model is even harder. ... <skip>

I also find the exact implementation considerably over-engineered, but I would never propose to just copy stuff from Rust as-is. What I do propose is to acknowledge the general principle - build the base language upon allocation-ignorant primitives that provide ownership semantics, and move all advanced memory management to the library. I sincerely believe that `std.allocator` alone, as presented by Andrei, will give D a huge edge when exploring a similar approach.
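As a sketch of the principle (the names below are illustrative only, not the actual std.allocator API): the language core stays allocation-ignorant, and the policy lives in a library type you can swap out.

struct Mallocator // hypothetical library allocator
{
    void[] allocate(size_t n)
    {
        import core.stdc.stdlib : malloc;
        auto p = malloc(n);
        return p is null ? null : p[0 .. n];
    }

    void deallocate(void[] b)
    {
        import core.stdc.stdlib : free;
        free(b.ptr);
    }
}

void main()
{
    Mallocator alloc;
    auto buf = alloc.allocate(1024);
    scope(exit) alloc.deallocate(buf); // deterministic, no GC involvement
}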
February 04, 2014
On 2/4/14, 11:28 AM, Steven Schveighoffer wrote:
> Where you have to be cognizant is avoiding cycles. Plain and simple. And
> it's not that difficult.

Do you have evidence to back that up?

Andrei