May 16, 2021

On Sunday, 16 May 2021 at 04:32:31 UTC, Ola Fosheim Grostad wrote:

> On Sunday, 16 May 2021 at 03:58:15 UTC, Mike Parker wrote:
>
> A global bump allocator is just another word for a memory leak. You don't need a comp sci degree to understand that this is not a good "feature".

In this case, it's not a leak. DMD does its business and exits. I'm not sure what your point is.
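
For readers unfamiliar with the pattern being debated: a bump allocator hands out memory by advancing a pointer through a large chunk and never frees individual allocations. The sketch below is purely illustrative; the `BumpAllocator` name and chunk sizes are invented here and this is not dmd's actual implementation.

```d
import core.stdc.stdlib : malloc;

struct BumpAllocator
{
    ubyte* chunk;     // current chunk of raw memory
    size_t used;      // bytes handed out from the current chunk
    size_t capacity;  // total bytes in the current chunk

    void* allocate(size_t size)
    {
        // Round the request up so every allocation stays pointer-aligned.
        size = (size + 15) & ~cast(size_t) 15;
        if (chunk is null || used + size > capacity)
        {
            // Grab a fresh chunk; previous chunks are never freed.
            capacity = size > (1 << 20) ? size : (1 << 20);
            chunk = cast(ubyte*) malloc(capacity);
            used = 0;
        }
        void* p = chunk + used;
        used += size;
        return p;
    }

    // There is deliberately no free(): memory is reclaimed only when the
    // process exits, which is exactly the trade-off being debated above.
}
```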

May 16, 2021

On Sunday, 16 May 2021 at 04:37:52 UTC, Mike Parker wrote:

> On Sunday, 16 May 2021 at 04:32:31 UTC, Ola Fosheim Grostad wrote:
>
>> On Sunday, 16 May 2021 at 03:58:15 UTC, Mike Parker wrote:
>>
>> A global bump allocator is just another word for a memory leak. You don't need a comp sci degree to understand that this is not a good "feature".
>
> In this case, it's not a leak. DMD does its business and exits. I'm not sure what your point is.

WTF? All programs exit. Most programs try to play nice so that the user can get the most out of his computer.

The main purpose of implementing a language in itself is to force the language designers to eat their own pudding. A secondary goal is to showcase language features.

The proof of the pudding is in the eating. If you yourself refuse to eat it, people will take that as a sign that the pudding isn't ready for consumption. The first thing a seasoned programmer will look at is how the compiler itself is implemented using the language's features.

Memory management in compilers isn't more complicated than in other system software, and it takes only a tiny fraction of compilation time.

There are many good reasons to eat your own pudding. If you don't, people will assume it isn't edible.

May 16, 2021

On Sunday, 16 May 2021 at 05:35:30 UTC, Ola Fosheim Grostad wrote:

> WTF? All programs exit. Most programs try to play nice so that the user can get the most out of his computer.

Yeah, and...? dmd is a very short-running program, and its allocated memory is released when it exits.

> There are many good reasons to eat your own pudding. If you don't, people will assume it isn't edible.

You're acting as if D can only be used with the GC. dmd is written in D, so we are eating our own pudding. Walter is very focused on making compiles as fast as possible, and he determined that releasing memory and using the GC would prevent him from meeting that goal. And he can still write his program in D.

The GC is not a panacea. It's not suitable for every use case or performance goal. And that's perfectly fine.
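
To be concrete about "D can be used without the GC": marking a function @nogc makes the compiler reject GC allocations inside it, while C-style manual allocation still works. A minimal sketch; the function names here are invented for illustration.

```d
import core.stdc.stdlib : malloc, free;

@nogc nothrow
int[] makeBuffer(size_t n)
{
    // auto buf = new int[n];  // would be rejected: `new` allocates on the GC heap
    auto p = cast(int*) malloc(n * int.sizeof);
    return p[0 .. n];          // slice over malloc'd memory, no GC involved
}

@nogc nothrow
void releaseBuffer(int[] buf)
{
    free(buf.ptr);             // caller is responsible for freeing
}
```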

May 16, 2021

On Sunday, 16 May 2021 at 05:50:12 UTC, Mike Parker wrote:

> Yeah, and...? dmd is a very short-running program, and its allocated memory is released when it exits.

No, it isn't very short-running. And that is not a valid argument either, as the user may have other tasks that need the memory.

To keep it short:

Programs that acquire significantly more resources than needed are misbehaving. I've never met anyone who disagrees with this view.

Memory management is insignificant time-wise if you have a solution suitable for systems programming.

If D does not have a memory management solution suitable for DMD, then D does not have a solution for systems level programming, yet...

If this is difficult to accept, then there is no hope of improvements.

> The GC is not a panacea. It's not suitable for every use case or performance goal. And that's perfectly fine.

If the GC is suitable for anything at the system level, it would be a batch program like a compiler.

May 16, 2021

On Sunday, 16 May 2021 at 06:44:09 UTC, Ola Fosheim Grostad wrote:

> On Sunday, 16 May 2021 at 05:50:12 UTC, Mike Parker wrote:
>
>> Yeah, and...? dmd is a very short-running program, and its allocated memory is released when it exits.
>
> No, it isn't very short-running. And that is not a valid argument either, as the user may have other tasks that need the memory.
>
> To keep it short:
>
> Programs that acquire significantly more resources than needed are misbehaving. I've never met anyone who disagrees with this view.
>
> Memory management is insignificant time-wise if you have a solution suitable for systems programming.
>
> If D does not have a memory management solution suitable for DMD, then D does not have a solution for systems level programming, yet...
>
> If this is difficult to accept, then there is no hope of improvements.
>
>> The GC is not a panacea. It's not suitable for every use case or performance goal. And that's perfectly fine.
>
> If the GC is suitable for anything at the system level, it would be a batch program like a compiler.

This is the big difference I see here: everyone keeps discussing how much the GC matters, while other language communities, regardless of what others think, just keep doing their stuff.

https://www.f-secure.com/en/consulting/foundry/usb-armory

https://tinygo.org/

https://makecode.com/language

https://www.microej.com/

https://www.wildernesslabs.co/

Now you can argue that Go, Python, JavaScript, Java, and .NET aren't suitable for systems programming, that people are making a big mistake adopting them, that they are just subsets of the actual languages; pick whatever argument against them you like.

However, at the end of the day, hardware is being sold and developers are making money with such products, regardless of their suitability for systems programming or anything else.

Meanwhile, over here, yet another round of GC discussion starts, or another debate about which features should be the next great thing, and then one wonders why adoption isn't working out.

May 16, 2021

On Sunday, 16 May 2021 at 06:44:09 UTC, Ola Fosheim Grostad wrote:

> If this is difficult to accept, then there is no hope of improvements.

This has nothing to do with D. The C version of the D compiler also did not release memory. Walter has written in these forums about why he made that decision, and it has absolutely nothing to do with the memory management options in any language. It was a conscious decision to pay for faster compile times with increased memory usage.

If you think that's a bad decision, fine. But you're flat out wrong that it says anything about D.

> If the GC is suitable for anything at the system level, it would be a batch program like a compiler.

And no one says it isn't. That Walter chose not to use the GC says nothing at all about its suitability for compilers. Of course someone who decided the cost of free is too high would choose not to use the GC!

Someone who is not Walter might choose to take the hit on compile times to pay for the benefits of the GC. Then, of course, they would write their code with that in mind and optimize it for GC usage. D gives them the choice to make the right decision for their specific use case to meet their specific goals.

You're extrapolating a broad point from a single example based on a flawed interpretation.

May 16, 2021

On Sunday, 16 May 2021 at 07:25:30 UTC, Mike Parker wrote:

> And no one says it isn't. That Walter chose not to use the GC says nothing at all about its suitability for compilers. Of course someone who decided the cost of free is too high would choose not to use the GC!

That's not obvious at all. In a batch program, a GC can perform better than naive free() if it is possible to give the compiler good information about where and when to scan.
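
One way to read that claim in D terms, as a rough sketch only: let a batch program allocate freely with collections switched off, then reclaim everything in one sweep, or simply at process exit. The `compileAll` function below is hypothetical; `GC.disable`, `GC.enable`, and `GC.collect` are the actual druntime calls.

```d
import core.memory : GC;

void compileAll(string[] sources)
{
    GC.disable();               // no collection pauses during the hot phase
    scope (exit) GC.enable();   // restore normal behaviour afterwards

    foreach (src; sources)
    {
        // ... lex, parse, build ASTs and symbol tables, allocating freely ...
    }

    GC.collect();               // one optional sweep at the end of the batch
}
```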

But if the people who make language decisions don't struggle with memory management in their own coding practice, then significant improvements in this feature set are less likely.

May 16, 2021

On Sunday, 16 May 2021 at 07:18:54 UTC, Paulo Pinto wrote:

> Now you can argue that Go, Python, JavaScript, Java, and .NET aren't suitable for systems programming, that people are making a big mistake adopting them, that they are just subsets of the actual languages; pick whatever argument against them you like.

Python is the only one of those languages not well suited for implementing a compiler. And no, it is not systems level programming, but it is the level right before it...

This thread is about planning for competitive memory management in D, not in other languages.

Is there a plan? No? Will there be a plan? Right now, it does not seem likely. People make suggestions all the time, but there is no signal of a likely path.

May 16, 2021

On Sunday, 16 May 2021 at 05:50:12 UTC, Mike Parker wrote:

> You're acting as if D can only be used with the GC. dmd is written in D, so we are eating our own pudding. Walter is very focused on making compiles as fast as possible, and he determined that releasing memory and using the GC would prevent him from meeting that goal. And he can still write his program in D.
>
> The GC is not a panacea. It's not suitable for every use case or performance goal. And that's perfectly fine.

In practice that is the case: you cannot avoid the GC unless you avoid using anything from the standard library. As soon as you use dynamic arrays, associative arrays, and many other data structures, the GC will be used.

There are two options as I see it. Either make the standard library use no GC at all (likely via manual memory management or some other method that is not a tracing GC), or introduce fat pointers where GC objects are a special type, and then you can recompile the standard library for the GC type you want.
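
A quick way to see where the GC sneaks in today is to mark a function @nogc and watch what the compiler rejects. The commented-out lines below are the operations that would fail to compile; the `demo` function is just an illustration.

```d
void demo(string x, string y) @nogc
{
    int[] a;
    // a ~= 1;             // Error: appending to a dynamic array may allocate on the GC heap
    // int[string] aa;
    // aa["key"] = 1;      // Error: associative array insertion allocates on the GC heap
    // auto s = x ~ y;     // Error: array concatenation allocates on the GC heap
}
```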

May 16, 2021

On Sunday, 16 May 2021 at 09:22:00 UTC, Ola Fosheim Grostad wrote:

> Is there a plan? No? Will there be a plan? Right now, it does not seem likely. People make suggestions all the time, but there is no signal of a likely path.

On the topic of a plan (just in general, not specifically about DMD's memory usage):

  1. It raises the question of how anyone can even organise a cohesive plan going forward, since it requires participation from the leadership, the core team, and (ideally) D's community.

  2. Even if we can bring a plan forward, how can the people in charge ensure that it is executed, especially in a timely manner?

  3. One problem I see all the time in discussions about D: we want to be everything, yet end up as nothing; we want to please every single type of programmer, yet that leads to impossible situations, and compromise is for some reason always out of the question; we want everything to be perfect and to work in every situation, but that often isn't achievable, and no compromise is made.

So with massive hurdles at every step of even getting people together to make a plan, let alone making the actual plan, what could the future look like for D's progression and evolution?

My best guess is: it'll be the same as always. DIPs get rejected for not being completely perfect in every possible use case (and god forbid they use the GC); features appear out of nowhere; "glaring" or "fundamental" issues (for whoever is complaining, that is) are ignored and swept to the side with "fix it yourself/file an issue"; things just happen when they happen, with no rhyme or reason, and so on...

Even worse is when exciting things like newCTFE are teased for years, only to be dropped :(

Or things like DIP 1028(?), where mostly a single point of tension (all extern functions being @safe by default) caused all that hard work to go to waste.

Without a cohesive direction and a well-defined plan, I'd say it definitely increases the risk for anyone (person or entity) adopting D for a project, due to the large uncertainty and confusion about what kind of language D is.

e.g. I wonder what people who keep an eye on D, but only on a casual level, would think once they learn about ImportC. I wonder if the feeling would be of confusion or excitement (or both!). I wonder if they'll get a feeling of an identity crisis of sorts.

e.g. We're a language with both a GC and @nogc, yet we don't provide many tools out of the box for @nogc code (containers?); we're becoming GC-phobic with regard to features and library changes; etc.

Then there's all this hubbub about a possible Phobos v2/v2021/whatever, and it's like: will this actually lead anywhere, is there any actual intention to go through with it, or is this just a theoretical talking point for the better part of the next decade?

idk, this was kind of just a ramble. I feel D's overall direction and management don't seem to be "keeping up with the times", so to speak. It's kind of like that "This is fine" meme in a way; I don't know how to describe it.

Uncertainty. Lack of clarity. Lack of obvious public direction and vision. What could be done to try and tackle these things (if they're even problems at all, and it's not just me having bad observations)?