November 17, 2021
On Wednesday, 17 November 2021 at 05:51:48 UTC, forkit wrote:
> On Wednesday, 17 November 2021 at 02:08:22 UTC, H. S. Teoh wrote:
>>
>>
>> One day, the realization will dawn on decision-makers that using such an outdated language in today's environment, radically different from the '70s when C was designed, is a needless waste of resources. And then the inevitable will happen.
>>
>>
>> T
>
> I don't accept your assertion that C is outdated.
>
> Is assembly outdated?
>
> C should be the closest to assembly you can get, and no closer. That is the value proposition of C, in 'modern times'.
>
> If your assertion were, instead, that there are problem domains where something other than C should be considered, then I would wholeheartedly accept it. And in such domains, D is certainly worth evaluating.

Google and the Linux kernel developers don't care about your opinion and have started introducing Rust into the Linux kernel.

AUTOSAR, for high-integrity computing, doesn't care about your opinion and now mandates C++14 as the language to use in AUTOSAR-certified software (ISO 26262, road vehicle functional safety).

The Arduino folks don't care about your opinion either: Arduino libraries are written in C++, and they also have Rust and Ada collaborations.

C was close to PDP-11 assembly; it is hardly close to any modern CPU.
November 17, 2021
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
> On Wed, Nov 17, 2021 at 01:57:23AM +0000, zjh via Digitalmars-d wrote:
>> [...]
>
> Why bend over backwards to please the GC-phobic crowd? They've already made up their minds; there's no convincing them.
>
> [...]

+1
November 17, 2021
On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:

> And you know what?  In spite of all this time and effort, programmers get it wrong anyway -- typical C code is full of memory leaks, pointer bugs, and buffer overflows.

People write bugs. You don't say!

> Most of them are latent and only trigger in obscure environments and unusual inputs, and so lie undiscovered, waiting for the day it suddenly explodes on a customer's production environment.

The GC isn't solving those issues. It's masking them. If you have a stale pointer somewhere, the bug *is* that the pointer is stale. Making it point to some forgotten piece of data is not a solution; it's a write-off of a lost cause. Yeah, it's @safe, sure. Because you're leaking.
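To make that concrete, here's a minimal sketch of such a masked bug (Session, registry and closeSession are made up for illustration):

```d
// A stale reference the GC keeps alive. In C this dangling pointer
// might crash or corrupt memory; with a GC it's "safe" - and silently
// leaks a megabyte per session.
class Session { ubyte[1024 * 1024] payload; }

Session[int] registry;

void closeSession(int id)
{
    // Bug: the session is "closed" but never removed from the registry
    // (no registry.remove(id)), so the GC must keep it, payload and
    // all, alive forever.
}

void main()
{
    foreach (id; 0 .. 100)
        registry[id] = new Session;
    foreach (id; 0 .. 100)
        closeSession(id);   // ~100 MB now unreachable by intent,
                            // but still reachable by the GC
}
```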

> Or somebody comes up with a security exploit...

Your susceptibility to security (and other) exploits grows in proportion to the number of dependencies, of which druntime (and, consequently, the GC) is one. So that's kind of a moot point.

> With a GC, you instantly eliminate 90% of these problems.  Only 10% of the time, you actually need to manually manage memory -- in inner loops and hot paths where it actually matters.

It's so funny how you keep talking about this mythical 90% vs. 10% split. When you have 16ms *for everything*, and a single collection takes 8ms, during which the whole world is stopped, you can't have 90/10. When you can't actually disable the GC (and you can't *actually* disable the GC), you can't have 90/10, because eventually some forced collection *will* turn that 90 into 99. So, practically, it comes down to this: either you use the GC or you don't, period. That is not the fault of the GC per se; it's the consequence of a lack of control. Unfortunately, the price of convenience is sometimes just too high.
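For concreteness, the "90/10" pattern being advocated looks roughly like this (an illustrative sketch, not a benchmark; names and numbers are made up):

```d
// The "10%": the hot path, statically forbidden from GC allocation.
@nogc void hotLoop(float[] scratch)
{
    foreach (ref v; scratch)
        v *= 0.5f;
}

void main()
{
    // The "90%": ordinary code, free to allocate via the GC.
    auto scratch = new float[](4096);

    foreach (frame; 0 .. 600)   // e.g. ten seconds at 60 FPS, ~16.7ms each
    {
        hotLoop(scratch);
        // The point of contention: any GC allocation elsewhere in the
        // program can still trigger a stop-the-world collection here,
        // and a single 8ms pause eats half the frame budget.
    }
}
```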

> GC phobia is completely irrational and I don't see why we should bend over backwards to please that crowd.

Simply put? Because any decent API doesn't transfer its garbage to its callers. And because, surprise, sometimes deterministic run time is a requirement. If the cost of calling your function is 100 cycles now and a million at some poorly specified point in the future, I'd consider looking for something that would just take the million up front.
November 17, 2021
On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav Blinov wrote:
> On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
>> [...]
>
> People write bugs. You don't say!
>
> [...]

What do you mean by "you can't actually disable the GC"? 🤔
November 17, 2021
On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn wrote:
> On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav Blinov wrote:
>> On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
>>> [...]
>>
>> People write bugs. You don't say!
>>
>> [...]
>
> What do you mean by "you can't actually disable the GC"? 🤔

I mean GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.
November 17, 2021
On Wednesday, 17 November 2021 at 07:25:45 UTC, Stanislav Blinov wrote:
> On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn wrote:
>> On Wednesday, 17 November 2021 at 07:08:45 UTC, Stanislav Blinov wrote:
>>> On Wednesday, 17 November 2021 at 02:32:21 UTC, H. S. Teoh wrote:
>>>> [...]
>>>
>>> People write bugs. You don't say!
>>>
>>> [...]
>>
>> What do you mean by "you can't actually disable the GC"? 🤔
>
> I mean GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.

GC.disable() disables the running of a collection phase (a collection runs only when even the OS returns an out-of-memory error) but still allows allocation.
@nogc disables all allocations via the GC.

How else do you propose we formalize these two notions further? Should the GC throw an Error when even the OS returns OoM, rather than run a collection cycle?
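For reference, a minimal sketch of the two mechanisms side by side (illustrative only; as discussed below, exactly when collections may still run under GC.disable is the contentious part):

```d
import core.memory : GC;

// @nogc statically forbids GC allocation: `new` here would not compile.
@nogc int sum(const int[] values)
{
    int total;
    foreach (v; values)
        total += v;
    return total;
}

void main()
{
    GC.disable();             // ask the runtime to suppress collections...
    scope (exit) GC.enable();

    auto buf = new int[](1_000); // ...allocation is still allowed, and per
                                 // the spec a collection may still occur if
                                 // the implementation deems it necessary
                                 // (e.g. an out-of-memory condition)
    assert(sum(buf) == 0);       // new int[] is zero-initialized
}
```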

November 17, 2021
On Wednesday, 17 November 2021 at 07:35:43 UTC, Tejas wrote:
> On Wednesday, 17 November 2021 at 07:25:45 UTC, Stanislav Blinov wrote:
>> On Wednesday, 17 November 2021 at 07:16:35 UTC, Imperatorn wrote:
>>> What do you mean by "you can't actually disable the GC"? 🤔
>>
>> I mean GC.disable/GC.enable. The spec needs to be WAY more strict and, well, specific, for those to be of use.
>
> GC.disable() disables the running of a collection phase (a collection runs only when even the OS returns an out-of-memory error) but still allows allocation.

That is not at all how it's documented: "Collections may continue to occur in instances where the implementation deems necessary for correct program behavior, such as during an out of memory condition."
IOW, it's basically allowed to collect whenever, and not "only when even the OS returns OoM".

> @nogc disables all allocations via the GC.

Not talking about @nogc.

> How else do you propose we formalize these two notions further? Should the GC throw an Error when even the OS returns OoM, rather than run a collection cycle?

If it's disabled - yes. Because "disable" should mean "disable", not "disable, but..." Failing that, it should be renamed "disableBut", with an exact specification of when collections may still occur, not a wave-off as implementation-defined. I wouldn't want to depend on implementation details; would you?
November 17, 2021
On Tuesday, 16 November 2021 at 18:17:29 UTC, Rumbu wrote:
> At least from my point of view, it seems that D recently made a shift from a general-purpose language to a C successor, hence the latest efforts to improve betterC and C interop while neglecting other areas of the language.
>
> By other areas I mean half-baked language built-ins, or OOP support that failed to evolve enough even to keep pace with the languages D initially took inspiration from (e.g. Java and its successors).
>
> In this new light, even if I am not bothered by it, I must admit that the garbage collector became something that doesn't fit in.
>
> Now, without a GC, more than half of the language risks becoming unusable, and that's why I ask myself how you see the future of memory management in D.
>
> For library development it is not necessarily a big deal, since the allocator pattern can be implemented for each operation that needs to allocate.
>
> But for the rest of the features that are part of the core language (e.g. arrays, classes, exceptions), what memory model do you consider a good fit? Do you think compiler-supported ARC can be accepted as a deterministic memory model by everyone? Or are memory ownership and flow analysis better?
>
> Not assuming a standard memory model can be a mistake: the C crowd will always complain that they cannot use feature X, and others will complain that they cannot use feature Y because it is not finished or its semantics are stuck in the 2000s.

D always aimed to be a C/C++ successor, though. It had Java influences, but that's it; it always had the ambition of being a systems programming language.

That it was flexible enough to even be good at scripting tasks was an accidental boon (and downfall):

  • GC means no systems programming, and instead of removing the GC or embracing it, we're stuck in between;
  • ARC seems to be a pipe dream for some reason;
  • front-end refactoring is considered infeasible; and
  • code breakage is only allowed as a regression, i.e. a bug that must be fixed.
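As an aside on the allocator pattern mentioned above, a minimal sketch using std.experimental.allocator (the duplicate helper is made up for illustration):

```d
import std.experimental.allocator : makeArray, dispose;
import std.experimental.allocator.mallocator : Mallocator;

// Hypothetical helper: copy a slice with whatever allocator the caller
// supplies, so the callee never hands GC-managed memory back to its caller.
T[] duplicate(Allocator, T)(auto ref Allocator alloc, const(T)[] source)
{
    auto copy = alloc.makeArray!T(source.length);
    copy[] = source[];          // element-wise copy into the new buffer
    return copy;
}

void main()
{
    int[3] data = [1, 2, 3];    // stack storage, no GC involved
    auto copy = Mallocator.instance.duplicate(data[]);
    scope (exit) Mallocator.instance.dispose(copy);  // deterministic cleanup
    assert(copy == data[]);
}
```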
November 17, 2021
On Wednesday, 17 November 2021 at 06:50:52 UTC, Paulo Pinto wrote:
>
> Google and the Linux kernel developers don't care about your opinion and have started introducing Rust into the Linux kernel.
>
> AUTOSAR, for high-integrity computing, doesn't care about your opinion and now mandates C++14 as the language to use in AUTOSAR-certified software (ISO 26262, road vehicle functional safety).
>
> The Arduino folks don't care about your opinion either: Arduino libraries are written in C++, and they also have Rust and Ada collaborations.
>
> C was close to PDP-11 assembly; it is hardly close to any modern CPU.

Well, clearly those examples demonstrate that my opinion has some merit ;-)

Also, many of C's so-called problems are really library problems. You can't even do I/O in C without a library.

Also, you kinda left out a lot... like all the problem domains where C is still the language of choice, even to this day.

I mean, even Go was originally written in C. It seems unlikely they could have written Go in Go.
November 17, 2021
On Wednesday, 17 November 2021 at 07:50:06 UTC, Stanislav Blinov wrote:
> That is not at all how it's documented: "Collections may continue to occur in instances where the implementation deems necessary for correct program behavior, such as during an out of memory condition."
> IOW, it's basically allowed to collect whenever, and not "only when even the OS returns OoM".

Oops. Okay, that's not very nice :(

>> @nogc disables all allocations via the GC.
>
> Not talking about @nogc.

You can choose to ignore it, but I feel people will bring it up in these kinds of discussions anyway.

>> How else do you propose we formalize these two notions further? Should the GC throw an Error when even the OS returns OoM, rather than run a collection cycle?
>
> If it's disabled - yes. Because "disable" should mean "disable", not "disable, but..." Failing that, it should be renamed "disableBut", with an exact specification of when collections may still occur, not a wave-off as implementation-defined. I wouldn't want to depend on implementation details; would you?

Yeah, me neither.

I think such harsh constraints aren't making it into the language spec anytime soon. If you want such guarantees, you'll have to stick to -betterC... but that is only supposed to be used as a transition technology, not a dev platform in its own right...

Yeah, D is a no-go for latency-sensitive stuff, I think (maybe ARC + @weak pointers could help... but ARC itself seems far away, if even possible...)