December 26, 2022
On 26/12/2022 2:13 PM, areYouSureAboutThat wrote:
> I'm unaware of any other programming language that has been more successful in doing just that.

Ooo oo, I can name two!

BASIC: almost every microcomputer came with it by default in ROM.

And C.

C before and after putting parameter types into the function parameter list has a pretty different feel, and I believe the change had some semantic flow-on effects, judging from what I saw in a 1970s-era C compiler which was dead simple.
December 25, 2022
On 12/25/2022 5:19 PM, Richard (Rikki) Andrew Cattermole wrote:
> C before and after putting parameter types into the function parameter list has a pretty different feel, and I believe the change had some semantic flow-on effects, judging from what I saw in a 1970s-era C compiler which was dead simple.

You're referring to function prototyping, a C++ feature which was backported to C.

I programmed in C a lot before function prototyping. The errors were rampant and very difficult to find. C compilers first started adding prototyping as an extension, and it grew so popular it had to be put in the Standard.

C has had a number of fundamental improvements from K&R C that dramatically reduced the incidence of bugs. Function prototyping was probably the biggest.
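
To make that class of bug concrete, here's a small hypothetical sketch in the old K&R style (the function half() is invented, not something from the posts above, and it needs a pre-C23 compiler that still accepts old-style definitions):

```c
/* Pre-prototype C: the declaration gives only the return type, so the
   compiler has no parameter information to check or convert against. */
double half();                  /* K&R-style declaration */

int main(void)
{
    /* The literal 4 is passed as an int, but half() expects a double.
       Nothing is diagnosed; the call is simply wrong (undefined
       behaviour), and the bug shows up far away as a garbage value. */
    double d = half(4);

    /* With an ANSI prototype -- double half(double); -- the compiler
       would convert 4 to 4.0, or reject a genuinely mismatched call. */
    return d > 0.0 ? 0 : 1;
}

double half(x)                  /* old-style (K&R) definition */
double x;
{
    return x / 2.0;
}
```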

Another big problem was sign-preserving vs. value-preserving integer promotion. That made C unportable, and the community was evenly split between the two. Sign-preserving won basically by fiat, and everyone else had to change their code.
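
The classic illustration of that split is code like this hypothetical snippet (assuming, as on most machines, that int is wider than unsigned short):

```c
#include <stdio.h>

int main(void)
{
    unsigned short us = 1;

    /* Value-preserving promotion: us becomes the int 1, so us - 2 is the
       int -1 and the test below is true.
       Sign-preserving (a.k.a. unsigned-preserving) promotion: us becomes
       the unsigned int 1, so us - 2 wraps around to a huge unsigned value
       and the test is false. */
    if (us - 2 < 0)
        printf("value-preserving promotion\n");
    else
        printf("sign-preserving promotion\n");

    return 0;
}
```

The same source gave different answers depending on which camp the compiler vendor was in, which is exactly the portability problem described above.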
December 26, 2022
On Monday, 26 December 2022 at 04:31:38 UTC, Walter Bright wrote:
> ...

Well, to paraphrase Andrew Koenig...
(and taking much liberty in doing so, i.e. going beyond what he actually said):


---

Many programmers hesitate to program in C for fear of getting it wrong.

Others are fearless... and do get it wrong.

---

December 26, 2022
C's uninitialized variables were another fountain of endless and hard-to-track-down problems. D initializes them by default for a very good reason.
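
A minimal hypothetical C sketch of that failure mode (the variable names are invented):

```c
#include <stdio.h>

int main(void)
{
    int total;               /* uninitialized: reading it is undefined
                                behaviour, and in practice it holds
                                whatever garbage was on the stack */

    for (int i = 1; i <= 10; i++)
        total += i;          /* the bug: it should say  int total = 0;  */

    /* May print 55, may print anything, and may change from build to
       build -- which is what makes it so hard to track down. The
       corresponding D declaration defaults to int.init, i.e. 0, so the
       behaviour is at least deterministic. */
    printf("%d\n", total);
    return 0;
}
```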
December 27, 2022
On Tuesday, 27 December 2022 at 00:38:33 UTC, Walter Bright wrote:
> C's uninitialized variables were another fountain of endless and hard-to-track-down problems. D initializes them by default for a very good reason.

Is it initialization or "branding"? Honestly, I've never been comfortable with D's initialization semantics.

On the one hand, the language postulates that, for any type T, there is always a valid value T.init. On the other hand, it makes that weird distinction between initialization and "branding". That has always felt a bit schizophrenic.

If T.init is supposed to be a valid value, then the constructor receives an already initialized object (not some "branded" aberration), so the constructor actually plays the role of assignment.

If T.init is supposed to be an invalid value useful for debugging, then variables initialized to that value... are not initialized.

IMO, D's attempt to conflate those two meanings of T.init is a failed experiment. Even C++ seems to be going in the right direction (https://youtu.be/ELeZAKCN4tY?t=4887) in that respect.
December 27, 2022
On Tuesday, 27 December 2022 at 09:41:59 UTC, Max Samukha wrote:
> If T.init is supposed to be a valid value, then the constructor receives an already initialized object (not some "branded" aberration), so the constructor actually plays the role of assignment.
>
> If T.init is supposed to be an invalid value useful for debugging, then variables initialized to that value... are not initialized.

The .init value is supposed to be both. A null pointer is a good example. It is valid in the sense that its behaviour is reliable. Dereferencing it always crashes the program, as opposed to undefined behaviour. Also, it will reliably compare equal to another null pointer.

But it is also a useful value for debugging, because accidentally using it immediately crashes and produces a core dump, making it obvious we had a null where there shouldn't be one. Also, when debugging, a pointer to address 0x0000_0000_0000_0000 is clearly uninitialised, while a pointer to whatever happens to be in memory might look like it's pointing to something valid.

December 27, 2022
On Mon, Dec 26, 2022 at 04:38:33PM -0800, Walter Bright via Digitalmars-d wrote:
> C's uninitialized variables were another fountain of endless and hard-to-track-down problems. D initializes them by default for a very good reason.

C's manual memory management is another fountain of endless, hard-to-debug pointer bugs and pernicious memory problems. Manual memory management requires absolute consistency and utmost precision, two things humans are notoriously bad at. In spite of some people's knee-jerk reactions to the GC, D having one has been a big saver of headaches in terms of the time and effort spent writing and debugging memory management code.
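
A small hypothetical C sketch of the kind of bookkeeping this refers to (the names are invented):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Ownership of the returned buffer exists only as a convention in the
   programmer's head -- which is where the consistency errors creep in. */
static char *make_label(const char *name)
{
    char *label = malloc(strlen(name) + 3);   /* "[" + name + "]" + '\0' */
    if (label == NULL)
        return NULL;
    sprintf(label, "[%s]", name);
    return label;                             /* caller must free() it */
}

int main(void)
{
    char *label = make_label("widget");
    if (label == NULL)
        return 1;

    printf("%s\n", label);
    free(label);

    /* A later edit that reuses `label` here, or a second free(label) on
       some other code path, compiles silently and fails far away at run
       time. With a GC, the "who frees this, and when" bookkeeping simply
       isn't there to get wrong. */
    return 0;
}
```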


T

-- 
A tree is held up by its roots, and a man by his friends.
December 27, 2022
On Tuesday, 27 December 2022 at 11:32:51 UTC, Dukc wrote:
> The .init value is supposed to be both. A null pointer is a good example. It is valid in the sense that its behaviour is reliable. Dereferencing it always crashes the program, as opposed to undefined behaviour. Also, it will reliably compare equal to another null pointer.

I'd say it is invalid, but using it results in deterministic behavior. Hence "invalid but good for debugging".

> But it is also a useful value for debugging, because accidentally using it immediately crashes and produces a core dump, making it obvious we had a null where there shouldn't be one. Also, when debugging, a pointer to address 0x0000_0000_0000_0000 is clearly uninitialised, while a pointer to whatever happens to be in memory might look like it's pointing to something valid.

Yeah, but in the case of an int, you can never tell whether the programmer wanted to initialize it to 0 or forgot to initialize it.

December 27, 2022
On Tuesday, 27 December 2022 at 00:38:33 UTC, Walter Bright wrote:
> C's uninitialized variables were another fountain of endless and hard-to-track-down problems. D initializes them by default for a very good reason.

Yes, D can certainly claim to have better strategies than C to 'reduce the number of weaknesses that occur in software'. This is a good thing, surely.

https://cwe.mitre.org/data/definitions/1337.html

But in the end, C (as you know, of course) operates at a low level of abstraction, and does so on purpose, so such mitigation strategies are not consistent with the spirit and design goals of C.

Nobody (as far as I know) is trying to create a better assembly. It is what it is.

Why does everyone want to create a better C? Well, they don't really. What they really want to do is reduce programming errors by constantly raising the level of abstraction.

I like initialised variables in D. I wouldn't like them in C. It would feel like I'd lost control. And in C, it should always be me who is in control (otherwise I'd have to revert to assembly).
December 27, 2022
On Tuesday, 27 December 2022 at 21:39:36 UTC, areYouSureAboutThat wrote:
> Nobody (as far as I know) is trying to create a better assembly. It is what it is.

There was Randall Hyde’s HLA. Abandoned long ago, though.