Deterministic Memory Management With Standard Library Progress
March 04, 2017
I've been having difficulty finding an up-to-date answer to this question, so I figured I'd ask the forum: can deterministic memory management be done in D without losing any main features? I ask this because I know it's technically possible, but the methods suggested in the answers I've read always involve avoiding most of the standard library. I know that there is an effort to reduce the standard library's reliance on the GC, but I don't know how far that's gotten in the last six months. Would anyone be able to give me more information?

To give context to my question, I don't have a problem with GCs, and this question isn't stemming from a C++ background. I've been told to learn C++ though, due to its efficiency and power.

It feels like D is a better choice for me to learn than C++; it's ostensibly C++ with much of the baggage and unfavorable quirks removed. But I'm wary of learning it if one of the major selling points of C++, deterministic memory management, isn't directly supported. I check back here every few months or so, but now I just can't find anything new.
March 04, 2017
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
> [snip]

It can be done. The C standard library, and thus malloc(), calloc() and free(), comes with the D standard library. There are also more high-level ways to do it: std.typecons.scoped, std.experimental.allocator and Dlib (a dub package), to name a few. You do not have to give up any parts of the D standard library to do that.
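For example, a minimal (untested) sketch of the plain C way, using the core.stdc bindings that ship with druntime:

import core.stdc.stdlib : malloc, free;
import core.exception : onOutOfMemoryError;

void main() @nogc nothrow
{
    // Allocate room for 16 ints, exactly as you would in C.
    auto p = cast(int*) malloc(16 * int.sizeof);
    if (p is null)
        onOutOfMemoryError();
    scope (exit) free(p);   // freed deterministically when main returns

    foreach (i; 0 .. 16)
        p[i] = i * i;
}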

That being said, manual memory management is recommended only if you have a specific reason to use it, because the D compiler cannot verify the memory safety of code doing such things. But remember that neither can C++. In D you can still at least have the compiler verify those parts of the code where you don't manage memory manually, so you're definitely better off than in C++ in this regard.

Value types initialized directly on the stack are deterministically destroyed without having to compromise @safe. (@safe marks code that the compiler verifies for memory safety. It isn't perfect, but it's close enough to catch almost all errors.) Of course, not everything can be a value type, because some data needs to have a variable size.
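For example (a small sketch of my own), a struct with a destructor is released exactly when its scope ends, and the code stays @safe:

import core.stdc.stdio : printf;

struct Guard
{
    int id;
    // printf is an extern(C) call, hence the @trusted on the destructor.
    ~this() @nogc nothrow @trusted { printf("releasing guard %d\n", id); }
}

void main() @safe
{
    auto outer = Guard(1);
    {
        auto inner = Guard(2);
    }   // "releasing guard 2" prints here, at the end of the inner scope
}       // "releasing guard 1" prints here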

Scope pointers are an upcoming feature which should, as I understand it, allow deterministic memory management with reference types too, in @safe code. It is already implemented, but it still seems to me to have too many unfinished corner cases to do its job yet, and the standard library is not yet compliant with the feature. I believe, though, that it will become much better.
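Roughly, a hedged sketch of what that checking looks like, assuming the compiler's -dip1000 preview switch which implements it:

int* global;

@safe void takeScope(scope int* p)
{
    *p += 1;        // using the pointer locally is fine
    // global = p;  // rejected: a scope pointer may not escape the function
}

@safe void caller()
{
    int x = 41;
    takeScope(&x);  // allowed under -dip1000 because the parameter is scope
    assert(x == 42);
}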

That's all assuming that "deterministic" means deterministic freeing of the memory and calling of destructors.
March 04, 2017
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
> I've been having difficulty finding an up-to-date answer to this question, so I figured I'd ask the forum: can deterministic memory management be done in D without losing any main features? I ask this because I know it's technically possible, but the methods suggested in the answers I've read always involve avoiding most of the standard library. I know that there is an effort to reduce the standard library's reliance on the GC, but I don't know how far that's gotten in the last six months. Would anyone be able to give me more information?
>
> To give context to my question, I don't have a problem with GCs, and this question isn't stemming from a C++ background. I've been told to learn C++ though, due to its efficiency and power.
>
> It feels like D is a better choice for me to learn than C++; it's ostensibly C++ with much of the baggage and unfavorable quirks removed. But I'm wary of learning it if one of the major selling points of C++, deterministic memory management, isn't directly supported. I check back here every few months or so, but now I just can't find anything new.

Well, you said it, the key point is "without losing main features".

There's quite a chunk of the standard library that is @nogc; almost everything in std.algorithm, for example (with the exception of one function IIRC, minor enough that I can't remember which). Aside from that, most modules providing some form of reading or encoding also provide two sets of functions: one that asks for an external buffer which you can use, the other that automatically allocates a (GC-ed) buffer to alleviate the need to manage it manually.
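For example, a small (untested) sketch that stays @nogc by working lazily over a caller-provided buffer:

import std.algorithm : map, sum;

int sumOfSquares(const(int)[] buf) @nogc nothrow pure @safe
{
    return buf.map!(x => x * x).sum;
}

void main() @nogc nothrow
{
    int[8] buffer = [1, 2, 3, 4, 5, 6, 7, 8];   // stack storage, no allocation
    assert(sumOfSquares(buffer[]) == 204);
}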

The other problem with the GC in the standard library is exceptions: as we have GC-allocated exceptions by default (to avoid having to free them manually), no function using exceptions can be @nogc. However, one should realistically note that the allocation of exceptions is, and should stay, exceptional, so there are cases where you could decide to use a cast to force an exception-using function into a @nogc scope.
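For illustration, a hedged sketch of my own of such a cast; it only silences the compile-time check, so if the exception is actually thrown it still allocates:

void mayThrow(int x)
{
    if (x < 0)
        throw new Exception("negative input");
}

void caller(int x) @nogc
{
    // Reinterpret the function pointer as @nogc so the call is accepted
    // inside a @nogc scope; only the exceptional path would allocate.
    alias NogcFn = void function(int) @nogc;
    auto f = cast(NogcFn) &mayThrow;
    f(x);
}

void main() @nogc
{
    caller(1);
}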

Finally, GC problems are completely exaggerated. It only runs when used, so having it manage only exceptions, for example, is completely viable, and it is possible to detach threads from the runtime if you don't want them to be seen by the GC. Using the GC globally and avoiding it locally in hot spots has proved to be a successful strategy within the community. So I wouldn't worry too much; the important thing is to give it a try.
March 05, 2017
> Finally, GC problems are completely exaggerated. It only runs when used, so having it manage only exceptions, for example, is completely viable, and it is possible to detach threads from the runtime if you don't want them to be seen by the GC. Using the GC globally and avoiding it locally in hot spots has proved to be a successful strategy within the community. So I wouldn't worry too much; the important thing is to give it a try.

Please stop making this argument. Just because one can amortize out the effect of the GC does not in any way change the fact that it is a stop-the-world GC, and for certain applications that is provably a show stopper. One can come up with any number of commercial apps that fail using D's GC. Some applications require near-real-time behavior, and D's GC does not provide any bounds on how long it runs... regardless of any average behavior (as that is meaningless when it only takes once to kill a person, create an audio glitch, etc). All these apps work fine with deterministic memory management or possibly other methods, but not D's GC.

It is not a viable solution just because it is a solution for you. When you start writing these apps that simply do not function to the standards set by the customer and this is all due to D's GC, then you will experience why it is bad. Until then, you won't have a clue.

It doesn't matter if it works 99.99% of the time; with applications that may run for months at a time and have critical behavior constraints, 99.99% doesn't look that great. So please don't push your ideals, motivations, and experiences on others, and just be honest with them. D's GC sucks for near-real-time applications, and it has problems. If one avoids the GC as best they can using traditional techniques, it minimizes the effect of the GC and makes the app closer to real time.
March 05, 2017
On Sunday, 5 March 2017 at 00:06:04 UTC, Inquie wrote:
>> [...]
>
> Please stop making this argument. Just because one can amortize out the effect of the GC does not in any way change the fact that it is a stop-the-world GC, and for certain applications that is provably a show stopper. One can come up with any number of commercial apps that fail using D's GC. Some applications require near-real-time behavior, and D's GC does not provide any bounds on how long it runs... regardless of any average behavior (as that is meaningless when it only takes once to kill a person, create an audio glitch, etc). All these apps work fine with deterministic memory management or possibly other methods, but not D's GC.
>
> It is not a viable solution just because it is a solution for you. When you start writing these apps that simply do not function to the standards set by the customer and this is all due to D's GC, then you will experience why it is bad. Until then, you won't have a clue.
>
> It doesn't matter if it works 99.99% of the time; with applications that may run for months at a time and have critical behavior constraints, 99.99% doesn't look that great. So please don't push your ideals, motivations, and experiences on others, and just be honest with them. D's GC sucks for near-real-time applications, and it has problems. If one avoids the GC as best they can using traditional techniques, it minimizes the effect of the GC and makes the app closer to real time.

I completely agree that there are cases where having the GC is a no-go, and I have even seen a number of projects doing fine without it. But exactly as you say, just because it is a problem for you doesn't mean it's a problem for everyone. Clearly the OP hasn't tried D yet, so as far as I'm concerned it looks like his concerns about the GC come from criticism he may have read elsewhere and not from an actual case that makes the GC impossible to use. Given that, I feel like reinforcing the fact that, indeed, it works great for most applications.
March 05, 2017
On Sunday, 5 March 2017 at 00:44:26 UTC, cym13 wrote:
> On Sunday, 5 March 2017 at 00:06:04 UTC, Inquie wrote:
>> [...]
>
> I completely agree that there are cases where having the GC is a no-go, and I have even seen a number of projects doing fine without it. But exactly as you say, just because it is a problem for you doesn't mean it's a problem for everyone. Clearly the OP hasn't tried D yet, so as far as I'm concerned it looks like his concerns about the GC come from criticism he may have read elsewhere and not from an actual case that makes the GC impossible to use. Given that, I feel like reinforcing the fact that, indeed, it works great for most applications.

I've learned the basics of D. I read the tutorial book, as I would call it, and some further tutorials on templates and other cool things. I just don't feel comfortable investing a significant effort acquainting myself further with the language without some guarantee that the feature will be completely supported eventually.

In a way, I'm picking a tool for my toolbelt, and C++ and D are competing tools. D looks like C++ 2.0, but it's missing a critical function of it as well. So, I'm conflicted.

I plan on answering the other answers, by the way. I just figured I'd wait a day or so to digest all the feedback together. But I do appreciate the effort from everyone so far, regardless of disagreements.
March 05, 2017
On Sunday, 5 March 2017 at 00:58:44 UTC, Anthony wrote:
> [...]
>
> I've learned the basics of D. I read the tutorial book, as I would call it, and some further tutorials on templates and other cool things. I just don't feel comfortable investing a significant effort acquainting myself further with the language without some guarantee that the feature will be completely supported eventually.

What do you consider complete support in this context? druntime, phobos, both? You can definitely write an application where all heap memory (after the druntime initialization) is allocated (and deallocated) deterministically, provided you don't use language builtins that require GC allocations (druntime) and you stay away from other people's code that allocates using the GC (this includes those parts of phobos). std.experimental.allocator even provides a nice, generic interface for this (you'll want to use one of the allocators that aren't GCAllocator, though).
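As a rough sketch (mine, not a definitive recipe), the generic interface looks like this when backed by Mallocator instead of GCAllocator:

import std.experimental.allocator : make, makeArray, dispose;
import std.experimental.allocator.mallocator : Mallocator;

struct Point { double x, y; }

void main()
{
    // Explicit allocation through malloc, via the generic allocator API.
    auto p  = Mallocator.instance.make!Point(1.0, 2.0);
    auto xs = Mallocator.instance.makeArray!int(64, 0);

    scope (exit)
    {
        Mallocator.instance.dispose(xs);   // freed exactly here,
        Mallocator.instance.dispose(p);    // no GC involved
    }

    p.x += xs.length;
}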
Considering D development is - AFAIK - not primarily driven by people paid for their work, I doubt you'll get a guarantee on any future development, though.

>
> In a way, I'm picking a tool for my toolbelt, and C++ and D are competing tools.

If possible, don't pick one, pick both (and to be even more annoying: also pick some Lisp, Erlang, Haskell, and Rust to get exposed to many different types of abstraction).

> D looks like C++ 2.0, but it's missing a critical function of it as well. So, I'm conflicted.

If you're referring to deterministic memory management, it's not missing; the functionality is there, it's just up to you to actually use it and not invoke the GC.
If you're referring to not all of phobos' functions being compatible with deterministic memory management (as opposed to stdc++), then yes, that's an ongoing effort.


March 05, 2017
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
> To give context to my question, I don't have a problem with GCs, and this question isn't stemming from a C++ background. I've been told to learn C++ though, due to its efficiency and power.
>
> It feels like D is a better choice for me to learn than C++; it's ostensibly C++ with much of the baggage and unfavorable quirks removed. But I'm wary of learning it if one of the major selling points of C++, deterministic memory management, isn't directly supported. I check back here every few months or so, but now I just can't find anything new.

Having learned C++ before D, I would argue (others will disagree) that even if your goal is to learn C++, you should start with D. You want to learn certain concepts and ways of thinking. If you start with C++, you have to spend your time learning the rough edges of C++ and what not to do, and it really does interfere with your learning. It's faster to learn D and then figure out how to do the same thing in C++ than to battle with the unpleasantness of C++ from the start. I regret all the hours I wasted on C++.

It's not really accurate to say someone should avoid D because of GC/memory management. You can call C++ from D, so while the cost-benefit analysis might favor C++ in some cases, there's no reason you can't write your program in D and then call into C++ when absolutely necessary. I do the same thing, except that I call into C more than C++.
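For what it's worth, a tiny sketch of that kind of interop: declaring and calling a C function directly from D.

extern (C) int abs(int value) @nogc nothrow;   // from the C runtime, already linked

void main() @nogc nothrow
{
    assert(abs(-42) == 42);
}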

Just my 2 cents as someone that does numerical programming. Ultimately, if you're looking at it from the perspective of the job market, it really doesn't make sense to spend time on D. If the goal is to learn, it doesn't make sense to spend time on C++.
March 05, 2017
On Sunday, 5 March 2017 at 01:41:47 UTC, Moritz Maxeiner wrote:
> [...]

Not having it guaranteed is understandable, albeit slightly disappointing.

I would pick both, if I had the time to do so. I'm a college student; with that in mind, I can only really learn one right now without giving up most of my free time. I think it'd be stressful if I tried.

I was referring to phobos. I feel intimidated by the idea of trying to code some of the functions of phobos myself in a no-gc manner. I'm sure I'd run into things way out of my knowledge domain.
March 05, 2017
On Saturday, 4 March 2017 at 18:09:10 UTC, Anthony wrote:
> To give context to my question, I don't have a problem with GCs, and this question isn't stemming from a C++ background. I've been told to learn C++ though, due to its efficiency and power.

I think you should start with "What kind of programs do I want to write?" rather than what language to choose. Then pick the best language for that domain.

But if you want to learn C++, then starting with the basic C subset and adding C++ features one by one is the best alternative. If learning C++ is your goal, then you need to come to terms with well-thought-out memory management strategies.

Of the non-C++ languages, Rust is possibly one that could give you some useful training, as the Rust compiler enforces what you should try to achieve in C++ with unique_ptr (roughly the same memory model in principle).

Also, there are many variants of C++ (C++17/C++11, C++03, C++98, ...), which lead to very different programming idioms.

It takes many years to become proficient in C++. I would estimate that it will take someone already proficient in C++98 1-2 years to become proficient in C++17.
