February 01, 2014
Yes, that is what Rust calls an "owning pointer". While we are at it, we might as well dream of "non-owning" or "borrowed" pointers, a.k.a. references in Rust. They have no effect on memory deallocation when they go out of scope, but they prevent the owning pointer they "borrow" from (i.e. the one they are assigned from) from being reassigned to a different object as long as borrowing references are in scope. There is more to owning and borrowed pointers, but I think that is the essence.
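
(For readers who haven't met the Rust terminology, a very rough D sketch of the distinction follows. It is purely illustrative: Owned, borrow and release are made-up names, and a runtime counter stands in for checks that Rust performs at compile time.)

    // Illustrative sketch only; not an existing D or Phobos facility.
    struct Owned(T)
    {
        private T* ptr;          // the owning pointer: sole responsibility for the object
        private int liveBorrows; // how many non-owning views are currently handed out

        this(T value) { ptr = new T; *ptr = value; }

        @disable this(this);     // exactly one owner, so no implicit copies

        ~this()
        {
            assert(liveBorrows == 0, "owner destroyed while borrows are live");
            // deallocation of *ptr would happen here
        }

        T* borrow() { ++liveBorrows; return ptr; }  // hand out a non-owning view
        void release() { --liveBorrows; }           // the borrow has ended
    }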

@Andrei (who I believe is the go-to (!) guy for all things memory-allocation related right now):
IIRC you mentioned that it was convenient to have the GC around for implementing the D language features. Would it just be a minor inconvenience to drop that dependency in the generated compiler output and replace it with new/delete or something equivalent to owning pointers, say over the next one or two years? Or would it be a major roadblock that would require too much redesign work? Maybe you could test the concept of owning and borrowed pointers internally on some compiler components before actually bringing them "to the surface" for us to play with, if they turn out useful. But only, of course, if we could leave the rest of D as good as it is today.

I would really like to see the crowd requesting GC-free operation taken a little more seriously, without asking them to forego D features or do manual C++-style memory management. Java's and .NET's garbage collectors have had enough time to mature and are still unsatisfactory for many applications.
February 01, 2014
On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:

>  std.typecons.RefCounted!T
>>
>> core.memory.GC.disable();
>>
>
> Wow. That was easy.
>
> I see, D's claim of being a multi-paradigm language is not false.


It's not a realistic suggestion. Everything you want to link uses the GC, and the language itself also uses the GC. Unless you write software in complete isolation and forego many valuable features, it's not a solution.
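
(For context, the approach quoted above amounts to roughly the following. A minimal sketch: std.typecons.RefCounted and core.memory.GC are real APIs, while Payload is a hypothetical type used only to show the mechanics. The catch, as noted, is that everything you link against still assumes a working GC.)

    import std.typecons : RefCounted;
    import core.memory : GC;

    struct Payload { int value; }          // hypothetical payload type

    void main()
    {
        GC.disable();                      // suspend collection cycles; allocation still works

        auto p = RefCounted!Payload(42);   // deterministic, reference-counted lifetime
        auto q = p;                        // copies share the payload and bump the count
        q.value = 43;                      // member access via alias this
    }   // the payload is destroyed when the last copy goes out of scope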


 Phobos does rely on the GC to some extent. Most algorithms and ranges do
>> not though.
>>
>
> Running (library) code that was written with GC in mind and turning GC off
> doesn't sound ideal.
>
> But maybe this allows me to familiarise myself more with D. Who knows, maybe I can learn to stop worrying and love garbage collection.
>
> Thanks for your help!
>

I've been trying to learn to love the GC for as long as I've been around
here. I really wanted to break that mental barrier, but it hasn't happened.
In fact, I am more than ever convinced that the GC won't do. My current #1
wishlist item for D is the ability to use a reference counted collector in
place of the built-in GC.
You're not alone :)

I write realtime and memory-constrained software (console games), and for
me, I think the biggest issue that can never be solved is the
non-deterministic nature of the collect cycles, and the unknowable memory
footprint of the application. You can't make any guarantees or predictions
about the GC, which is fundamentally incompatible with realtime software.
Language-level ARC would probably do quite nicely for the miscellaneous
allocations. Obviously, bulk allocations are still usually best handled in
a context-sensitive manner, i.e. regions/pools/freelists/whatever, but the
convenience of the GC paradigm does offer some interesting and massively
time-saving features to D.
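
(To illustrate the kind of context-sensitive bulk allocation mentioned above, a minimal free-list sketch; FreeList is a hypothetical type, and real engines layer regions and pools on top of something like this. Allocation and release are O(1) with no collection pauses, which is exactly the property the GC cannot promise.)

    import core.stdc.stdlib : malloc;

    struct FreeList(T)                     // hypothetical fixed-size free list
    {
        private struct Node { Node* next; }
        private Node* head;

        T* allocate()
        {
            if (head is null)
            {
                // each block must also be able to hold the recycling link
                enum blockSize = T.sizeof > Node.sizeof ? T.sizeof : Node.sizeof;
                return cast(T*) malloc(blockSize);
            }
            auto n = head;
            head = n.next;
            return cast(T*) n;
        }

        void deallocate(T* p)
        {
            auto n = cast(Node*) p;        // recycle the block instead of freeing it
            n.next = head;
            head = n;
        }
    }
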
Everyone will always refer you to RefCounted, which mangles your types and
pollutes your code, but aside from that, for ARC to be useful, it needs to
be supported at the language-level, such that the language/optimiser is
able to optimise out redundant incref/decref calls, and also that it is
compatible with immutable (you can't manage a refcount if the object is
immutable).
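
(To make the elision point concrete, a sketch of the redundant traffic a library-only solution produces; Handle is hypothetical, with its postblit and destructor standing in for incref/decref.)

    struct Handle
    {
        private int* refs;                           // shared reference count
        this(this) { if (refs) ++*refs; }            // "incref" on every copy
        ~this()    { if (refs && --*refs == 0) { /* free the payload here */ } }
    }

    void draw(Handle h)   // by-value parameter: one incref on entry, one decref on exit
    {
        // ... use h ...
    }                     // a language-aware optimiser could prove the pair redundant
                          // when the caller's reference outlives the call, and drop it;
                          // a pure library type cannot.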


February 01, 2014
On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:

> On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:
>
>>  std.typecons.RefCounted!T
>>>
>>> core.memory.GC.disable();
>>>
>>
>> Wow. That was easy.
>>
>> I see, D's claim of being a multi-paradigm language is not false.
>
>
> It's not a realistic suggestion. Everything you want to link uses the GC,
> and the language its self also uses the GC. Unless you write software in
> complete isolation and forego many valuable features, it's not a solution.
>
>
>  Phobos does rely on the GC to some extent. Most algorithms and ranges do
>>> not though.
>>>
>>
>> Running (library) code that was written with GC in mind and turning GC off
>> doesn't sound ideal.
>>
>> But maybe this allows me to familiarise myself more with D. Who knows,
>> maybe I can learn to stop worrying and love garbage collection.
>>
>> Thanks for your help!
>>
>
> I've been trying to learn to love the GC for as long as I've been around
> here. I really wanted to break that mental barrier, but it hasn't happened.
> In fact, I am more than ever convinced that the GC won't do. My current #1
> wishlist item for D is the ability to use a reference counted collector in
> place of the built-in GC.
> You're not alone :)
>
> I write realtime and memory-constrained software (console games), and for
> me, I think the biggest issue that can never be solved is the
> non-deterministic nature of the collect cycles, and the unknowable memory
> footprint of the application. You can't make any guarantees or predictions
> about the GC, which is fundamentally incompatible with realtime software.
> Language-level ARC would probably do quite nicely for the miscellaneous
> allocations. Obviously, bulk allocations are still usually best handled in
> a context sensitive manner; ie, regions/pools/freelists/whatever, but the
> convenience of the GC paradigm does offer some interesting and massively
> time-saving features to D.
> Everyone will always refer you to RefCounted, which mangles your types and
> pollutes your code, but aside from that, for ARC to be useful, it needs to
> be supported at the language-level, such that the language/optimiser is
> able to optimise out redundant incref/decref calls, and also that it is
> compatible with immutable (you can't manage a refcount if the object is
> immutable).

The problem isn't GCs per se, but D's horribly naive implementation. Games are written in GC languages all the time now (Unity/.NET). And let's be honest, games are something of a specialty; they do things most programs will never do.

You might want to read the GC Handbook. GCs aren't bad, but most, like the D GC, are just too simplistic for common usage today.

-- 
Adam Wilson
GitHub/IRC: LightBender
Aurora Project Coordinator
February 01, 2014
On 12/31/2012 7:14 AM, Sven Over wrote:
>
> In my job I'm writing backend services that power a big web site.
> Perfomance is key, as the response time of the data service in most
> cases directly adds to the page load time. The bare possibility that the
> whole service pauses for, say, 100ms is making me feel very uncomfortable.
>

FWIW, Vibe.d <http://vibed.org/> is very careful about avoiding unnecessary GC allocations.

February 01, 2014
On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com> wrote:

> On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:
>
>  On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:
>>
>>   std.typecons.RefCounted!T
>>>
>>>>
>>>> core.memory.GC.disable();
>>>>
>>>>
>>> Wow. That was easy.
>>>
>>> I see, D's claim of being a multi-paradigm language is not false.
>>>
>>
>>
>> It's not a realistic suggestion. Everything you want to link uses the GC, and the language its self also uses the GC. Unless you write software in complete isolation and forego many valuable features, it's not a solution.
>>
>>
>>  Phobos does rely on the GC to some extent. Most algorithms and ranges do
>>
>>> not though.
>>>>
>>>>
>>> Running (library) code that was written with GC in mind and turning GC
>>> off
>>> doesn't sound ideal.
>>>
>>> But maybe this allows me to familiarise myself more with D. Who knows, maybe I can learn to stop worrying and love garbage collection.
>>>
>>> Thanks for your help!
>>>
>>>
>> I've been trying to learn to love the GC for as long as I've been around
>> here. I really wanted to break that mental barrier, but it hasn't
>> happened.
>> In fact, I am more than ever convinced that the GC won't do. My current #1
>> wishlist item for D is the ability to use a reference counted collector in
>> place of the built-in GC.
>> You're not alone :)
>>
>> I write realtime and memory-constrained software (console games), and for
>> me, I think the biggest issue that can never be solved is the
>> non-deterministic nature of the collect cycles, and the unknowable memory
>> footprint of the application. You can't make any guarantees or predictions
>> about the GC, which is fundamentally incompatible with realtime software.
>> Language-level ARC would probably do quite nicely for the miscellaneous
>> allocations. Obviously, bulk allocations are still usually best handled in
>> a context sensitive manner; ie, regions/pools/freelists/whatever, but the
>> convenience of the GC paradigm does offer some interesting and massively
>> time-saving features to D.
>> Everyone will always refer you to RefCounted, which mangles your types and
>> pollutes your code, but aside from that, for ARC to be useful, it needs to
>> be supported at the language-level, such that the language/optimiser is
>> able to optimise out redundant incref/decref calls, and also that it is
>> compatible with immutable (you can't manage a refcount if the object is
>> immutable).
>>
>
> The problem isn't GC's per se. But D's horribly naive implementation, games are written on GC languages now all the time (Unity/.NET). And let's be honest, games are kind of a speciality, games do things most programs will never do.
>
> You might want to read the GC Handbook. GC's aren't bad, but most, like the D GC, are just to simplistic for common usage today.


Maybe a sufficiently advanced GC could address the performance
non-determinism to an acceptable level, but you're still left with the
memory non-determinism, and the conundrum that when your heap approaches
full (which is _always_ the case on a games console), the GC has to work harder
and harder, and more often, to try to keep the tiny bit of free memory
available.
A GC heap by nature expects you to have lots of memory, and also lots of
FREE memory.

No serious console game I'm aware of has ever been written in a language with a GC. Casual games, or games that don't attempt to raise the bar may get away with it, but that's not the industry I work in.


February 01, 2014
On 01.02.2014 05:33, Frank Bauer wrote:
> Yes, that is what Rust calls an "owning pointer". While we are at it, we
> might as well dream of "non-owning" or "borrowed" pointers, a.k.a.
> references in Rust. They don't have an effect on memory-deallocation
> when they go out of scope, but they prevent the owning pointer they
> "borrow" from (i.e. are assigned from) from being reassigned to a
> different object as long as there are borrowing references in scope.
> There is more to owning and borrowed pointers, but I think that is the
> essence.
>
> @Andrei (who I believe is the go-to (!) guy for all things memeory
> allocation related right now):
> IIRC you mentioned that it was convenient to have the GC around for
> implementing the D language features. Would it just be a minor
> inconvenience to drop that dependency in the generated compiler output
> and replace it with new/delete or something equivalent to owning
> pointers, say over the next one or two years? Or would it be a major
> roadblock that would require too much redesign work? Maybe you could
> test the concept of owning and borrowed pointers internally for some
> compiler components before actually bringing them "to the surface" for
> us to play with, if it turns out useful. But only of course, if we could
> leave the rest of D as good as it is today.
>
> I would really like to see the GC-free requesting crowd be taken a lttle
> more seriously. Without asking them to forego D features or do manual
> C++-style memory management. Javas and .NETs garbage collectors had
> enough time to mature and are still unsatisfactory for many applications.

Just to nitpick, reference counting is also GC from a CS point of view. :)
February 01, 2014
On 01.02.2014 06:29, Manu wrote:
> On 26 December 2012 00:48, Sven Over <dlang@svenover.de
> <mailto:dlang@svenover.de>> wrote:
>
>         std.typecons.RefCounted!T
>
>         core.memory.GC.disable();
>
>
>     Wow. That was easy.
>
>     I see, D's claim of being a multi-paradigm language is not false.
>
>
> It's not a realistic suggestion. Everything you want to link uses the
> GC, and the language its self also uses the GC. Unless you write
> software in complete isolation and forego many valuable features, it's
> not a solution.
>
>
>         Phobos does rely on the GC to some extent. Most algorithms and
>         ranges do not though.
>
>
>     Running (library) code that was written with GC in mind and turning
>     GC off doesn't sound ideal.
>
>     But maybe this allows me to familiarise myself more with D. Who
>     knows, maybe I can learn to stop worrying and love garbage collection.
>
>     Thanks for your help!
>
>
> I've been trying to learn to love the GC for as long as I've been around
> here. I really wanted to break that mental barrier, but it hasn't happened.
> In fact, I am more than ever convinced that the GC won't do. My current
> #1 wishlist item for D is the ability to use a reference counted
> collector in place of the built-in GC.
> You're not alone :)
>
> I write realtime and memory-constrained software (console games), and
> for me, I think the biggest issue that can never be solved is the
> non-deterministic nature of the collect cycles, and the unknowable
> memory footprint of the application. You can't make any guarantees or
> predictions about the GC, which is fundamentally incompatible with
> realtime software.


Meanwhile Unity and similar engines are becoming widespread, with C++ being pushed all the way to the bottom of the stack.

At least from what I hear in the gaming communities I hop around.

What is your experience there?

--
Paulo

February 01, 2014
On Saturday, 1 February 2014 at 04:33:23 UTC, Frank Bauer wrote:
> Yes, that is what Rust calls an "owning pointer". While we are at it, we might as well dream of "non-owning" or "borrowed" pointers, a.k.a. references in Rust. They don't have an effect on memory-deallocation when they go out of scope, but they prevent the owning pointer they "borrow" from (i.e. are assigned from) from being reassigned to a different object as long as there are borrowing references in scope. There is more to owning and borrowed pointers, but I think that is the essence.

I think D could use a simplified model with only `scope`. Using it as a storage class would effectively make the data owned (not necessarily a pointer), while using it as a qualifier would allow "borrowing" a scope reference/pointer, for example to pass it as a function argument.
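
(A rough sketch of what that reading of `scope` could look like; this illustrates the proposal, not the semantics D actually enforced at the time.)

    int* global;

    void consumer(scope int* p)    // "borrowed": p may be used but must not escape
    {
        // global = p;             // would be rejected under the proposed rules
    }

    void owner()
    {
        scope int* data = new int; // "owned": lifetime tied to this scope
        consumer(data);            // lend it out for the duration of the call
    }                              // data's lifetime ends here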
February 01, 2014
On Fri, 31 Jan 2014 23:35:44 -0800, Manu <turkeyman@gmail.com> wrote:

> On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com> wrote:
>
>> On Fri, 31 Jan 2014 21:29:04 -0800, Manu <turkeyman@gmail.com> wrote:
>>
>>  On 26 December 2012 00:48, Sven Over <dlang@svenover.de> wrote:
>>>
>>>   std.typecons.RefCounted!T
>>>>
>>>>>
>>>>> core.memory.GC.disable();
>>>>>
>>>>>
>>>> Wow. That was easy.
>>>>
>>>> I see, D's claim of being a multi-paradigm language is not false.
>>>>
>>>
>>>
>>> It's not a realistic suggestion. Everything you want to link uses the GC,
>>> and the language its self also uses the GC. Unless you write software in
>>> complete isolation and forego many valuable features, it's not a solution.
>>>
>>>
>>>  Phobos does rely on the GC to some extent. Most algorithms and ranges do
>>>
>>>> not though.
>>>>>
>>>>>
>>>> Running (library) code that was written with GC in mind and turning GC
>>>> off
>>>> doesn't sound ideal.
>>>>
>>>> But maybe this allows me to familiarise myself more with D. Who knows,
>>>> maybe I can learn to stop worrying and love garbage collection.
>>>>
>>>> Thanks for your help!
>>>>
>>>>
>>> I've been trying to learn to love the GC for as long as I've been around
>>> here. I really wanted to break that mental barrier, but it hasn't
>>> happened.
>>> In fact, I am more than ever convinced that the GC won't do. My current #1
>>> wishlist item for D is the ability to use a reference counted collector in
>>> place of the built-in GC.
>>> You're not alone :)
>>>
>>> I write realtime and memory-constrained software (console games), and for
>>> me, I think the biggest issue that can never be solved is the
>>> non-deterministic nature of the collect cycles, and the unknowable memory
>>> footprint of the application. You can't make any guarantees or predictions
>>> about the GC, which is fundamentally incompatible with realtime software.
>>> Language-level ARC would probably do quite nicely for the miscellaneous
>>> allocations. Obviously, bulk allocations are still usually best handled in
>>> a context sensitive manner; ie, regions/pools/freelists/whatever, but the
>>> convenience of the GC paradigm does offer some interesting and massively
>>> time-saving features to D.
>>> Everyone will always refer you to RefCounted, which mangles your types and
>>> pollutes your code, but aside from that, for ARC to be useful, it needs to
>>> be supported at the language-level, such that the language/optimiser is
>>> able to optimise out redundant incref/decref calls, and also that it is
>>> compatible with immutable (you can't manage a refcount if the object is
>>> immutable).
>>>
>>
>> The problem isn't GC's per se. But D's horribly naive implementation,
>> games are written on GC languages now all the time (Unity/.NET). And let's
>> be honest, games are kind of a speciality, games do things most programs
>> will never do.
>>
>> You might want to read the GC Handbook. GC's aren't bad, but most, like
>> the D GC, are just to simplistic for common usage today.
>
>
> Maybe a sufficiently advanced GC could address the performance
> non-determinism to an acceptable level, but you're still left with the
> memory non-determinism, and the conundrum that when your heap approaches
> full (which is _always_ on a games console), the GC has to work harder and
> harder, and more often to try and keep the tiny little bit of overhead
> available.
> A GC heap by nature expects you to have lots of memory, and also lots of
> FREE memory.
>
> No serious console game I'm aware of has ever been written in a language
> with a GC. Casual games, or games that don't attempt to raise the bar may
> get away with it, but that's not the industry I work in.

That's kind of my point. You're asking for massive changes throughout the entire compiler to support what is becoming more of an edge case, not less of one. For the vast majority of use cases, a GC is the right call and D has to cater to the majority if it wants to gain any significant mindshare at all. You don't grow by increasing specialization...

-- 
Adam Wilson
GitHub/IRC: LightBender
Aurora Project Coordinator
February 01, 2014
On 01.02.2014 08:35, Manu wrote:
> On 1 February 2014 16:26, Adam Wilson <flyboynw@gmail.com
> ...
>
> No serious console game I'm aware of has ever been written in a language
> with a GC. Casual games, or games that don't attempt to raise the bar
> may get away with it, but that's not the industry I work in.

Not sure how serious you consider The Witcher 2 for the Xbox 360, but they used a GC in their engine.

http://www.makinggames.de/index.php/magazin/2155_porting_the_witcher_2_on_xbox_360

Yes, they faced some issues with it, but it wasn't thrown away; rather, they optimized how it was used.

--
Paulo