On 17 April 2014 18:52, via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
On Thursday, 17 April 2014 at 08:22:32 UTC, Paulo Pinto wrote:
Of course it was sold at WWDC as "ARC is better than GC" and not as "ARC is better than the crappy GC implementation we have done".

I have never seen a single instance of a GC-based system doing anything smooth in the realm of audio/visual real-time performance without being backed by a non-GC engine.

You can get decent performance from GC-backed languages in the higher-level constructs on top of a low-level engine. IMHO the same goes for ARC. ARC is a bit more predictable than GC; GC is a bit more convenient and less predictable.

I think D has something to learn from this:

1. Support for manual memory management is important for low-level engines.

2. Support for automatic memory management is important for high-level code on top of that.

The D community is torn because there is some idea that libraries should assume point 2 above and then be retrofitted to point 1. I am not sure if that will work out.

See, I just don't find managed memory incompatible with 'low-level' realtime or embedded code, even, in principle, on tiny microcontrollers.
ARC would be fine in low-level code, assuming the language supported it to the fullest of its abilities. I'm confident that programmers would learn its performance characteristics and be able to work effectively with it in very little time.
It's well understood and predictable: you know exactly how it works and precisely what the costs are, and there are plenty of techniques for moving any ref fiddling out of a function once you identify it as the source of a bottleneck.
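
For example, here is a minimal sketch of hoisting the ref traffic out of a hot loop. The type is a hand-rolled ref-counted slice purely for illustration (the names are hypothetical, not anything from Phobos, and payload cleanup is omitted):

import core.stdc.stdlib : malloc, free;

// Hypothetical hand-rolled ref-counted slice; payload cleanup omitted for brevity.
struct RcArray
{
    private float[] payload;
    private int* count;

    this(float[] data)
    {
        payload = data;
        count = cast(int*) malloc(int.sizeof);
        *count = 1;
    }
    this(this) { if (count) ++*count; }                   // copy: one increment
    ~this() { if (count && --*count == 0) free(count); }  // last owner releases
    float[] borrow() { return payload; }                  // no ref traffic
}

float sum(RcArray arr)
{
    auto data = arr.borrow();  // pin once; arr keeps the payload alive for the call
    float total = 0;
    foreach (x; data)          // inner loop is plain loads and adds, no inc/dec
        total += x;
    return total;
}

With compiler-supported ARC the inc/dec would be inserted automatically, but the same hoisting applies: only copies and destructions touch the count, so the inner loop pays nothing.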

I think with some care and experience, you could use ARC just as effectively as full manual memory management in the inner loops, but also gain the conveniences it offers on the periphery where the performance isn't critical.
_Most_ code exists in this periphery, and therefore the importance of that convenience shouldn't be underestimated.


Maybe it is better to just say that structs are bound to manual memory management and classes are bound to automatic memory management.
Use structs for low-level stuff with manual memory management.
Use classes for high-level stuff with automatic memory management.

Then add language support for "union-based inheritance" in structs with a special construct for programmer-specified subtype identification.
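
Something along those lines can already be emulated in today's D with a tag plus an anonymous union; the special construct would presumably generate the tag and check it for you on "downcasts". A rough, purely illustrative sketch:

struct Shape
{
    enum Kind { circle, rect }
    Kind kind;                          // programmer-specified subtype identification

    union
    {
        struct { float radius; }        // "circle" payload
        struct { float w, h; }          // "rect" payload
    }
}

float area(ref const Shape s)
{
    switch (s.kind)
    {
        case Shape.Kind.circle: return 3.14159f * s.radius * s.radius;
        case Shape.Kind.rect:   return s.w * s.h;
        default: assert(0);
    }
}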

A model like that is at least conceptually easy to grasp, and the type system can more easily safeguard code than it can in a mixed model.

No. It misses basically everything that compels the change. Strings, '~', closures. D largely depends on its memory management. That's the entire reason why library solutions aren't particularly useful.
I don't want to see D evolve into another C++, where libraries/frameworks are separated or excluded by allocation practice.
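
To make that concrete, here is the sort of everyday D that silently allocates through the GC (a small illustrative sketch, nothing exotic):

string greet(string name)
{
    return "hello, " ~ name;   // '~' builds a new string on the GC heap
}

int delegate() makeCounter()
{
    int n;
    return () => ++n;          // escaping closure: its frame is GC-allocated
}

void main()
{
    auto msg  = greet("world");
    auto next = makeCounter();
    auto one  = next();        // 1
    auto two  = next();        // 2
}

Any library leaning on these features can't simply be dropped into a no-GC code base, which is exactly why a pure library solution doesn't address the concern.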

Automatic memory management in D is a reality. Unless you want to build yourself into a fully custom box (I don't!), you have to deal with it. Any library that wasn't written by a gamedev will almost certainly rely on it, and games are huge, complex things that typically incorporate lots of libraries. I've spent my entire adult lifetime dealing with these sorts of problems.


Most successful frameworks that allow high-level programming have two layers:
- Python / heavy-duty C libraries
- JavaScript / browser engine
- Objective-C / C and Cocoa/Core Foundation
- ActionScript / C engine

etc.

I personally favour the more integrated approach that D appears to be aiming for, but I am somehow starting to feel that, conceptually, that model is going to be difficult for most programmers to grasp in real projects. They don't really want the low-level stuff, and they don't want to have their high-level code bastardized by low-level requirements.

As far as I am concerned, D could just focus on the structs and the low-level stuff, and then later try to work in the high-level stuff. There is no efficient GC in sight, and the language has not been designed for one either.

ARC with whole-program optimization fits better into the low-level paradigm than GC. So if you start from low-level programming and work your way up to high-level programming then ARC is a better fit.
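
To illustrate what that buys, here is a hand-annotated sketch (hypothetical retain/release helpers, written out the way a naive ARC pass would insert them):

struct Buffer { float[] data; int* refs; }

void retain(ref Buffer b)  { ++*b.refs; }
void release(ref Buffer b) { if (--*b.refs == 0) { /* free payload here */ } }

float process(ref Buffer buf)
{
    retain(buf);               // naive ARC: retain on entry...
    float total = 0;
    foreach (x; buf.data)      // hot loop: no per-element ref traffic
        total += x;
    release(buf);              // ...release on exit
    return total;
}

void frame(ref Buffer buf)
{
    // The caller's reference is live across the whole call, and process never
    // stores buf anywhere; with whole-program knowledge that retain/release
    // pair is provably redundant and can be elided entirely.
    process(buf);
}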

The thing is, D is not particularly new; it's pretty much 'done', so there will be no radical change in direction like you seem to suggest.
But I generally agree with your final points.

The future is not manual memory management. But D seems to be pushing us back into that box without a real solution to this problem.
Indeed, it is agreed that there is no fantasy solution via GC on the horizon... so what?

Take this seriously. I want to see ARC absolutely killed dead rather than dismissed.