December 27, 2022
On 12/27/2022 1:41 AM, Max Samukha wrote:
> If T.init is supposed to be an invalid value useful for debugging, then variables initialized to that value... are not initialized.

It is up to the designer of the struct to decide on an initial value that can be computed at compile time. This is not a failure, it's a positive feature. It means struct instances will *never* be in a garbage state.

C++ does it a different way, not a better way.
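
For instance (a minimal sketch; the `Connection` struct and its field defaults are invented for illustration), the designer picks the defaults and every instance starts in that known state:

```d
struct Connection
{
    int    retries = 3;   // designer-chosen default, fixed at compile time
    double timeout;       // no explicit default: double.init, i.e. NaN
    void*  handle;        // pointers default to null
}

void main()
{
    Connection c;                       // no constructor call needed
    assert(c.retries == 3);
    assert(c.timeout != c.timeout);     // NaN compares unequal to itself
    assert(c.handle is null);           // never a garbage address
    static assert(Connection.init.retries == 3);  // computable at compile time
}
```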
December 27, 2022
On 12/27/2022 3:32 AM, Dukc wrote:
> The `.init` value is supposed to be both. A null pointer is a good example. It is valid in the sense that its behaviour is reliable. Dereferencing it always crashes the program, as opposed to undefined behaviour. Also it will reliably say yes when compared to another null pointer.
> 
> But it is also a useful value for debugging, because accidentally using it immediately crashes and produces a core dump, making it obvious we had a null where there shouldn't be one. Also when debugging, a pointer to address `0x0000_0000_0000_0000` is clearly uninitialised, while a pointer to whatever happens to be in memory might look like it's pointing to something valid.

D's positive initialization also ensures that instances will not be initialized with random garbage.
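
A minimal sketch of that reliability in D (the `Node` class is hypothetical):

```d
class Node
{
    int value;
}

void main()
{
    Node n;              // class references default to null, never garbage
    assert(n is null);   // reliably comparable against null
    // n.value = 1;      // would crash reliably with a clean backtrace,
                         // rather than scribbling on a random address
}
```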
December 27, 2022
On 12/27/2022 11:12 AM, Max Samukha wrote:
> Yeah, but in the case of an int, you can never tell whether the programmer wanted to initialize it to 0 or forgot to initialize it.

It's better than C++'s approach of leaving it with whatever random bit pattern happens to be in memory.
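
A small sketch of the contrast (variable names invented): an int's default 0 is ambiguous, while a float's default NaN announces itself:

```d
import std.math : isNaN;

void main()
{
    int count;       // 0: indistinguishable from a deliberate 0
    double reading;  // double.nan: visibly "not yet set"

    assert(count == 0);
    assert(reading.isNaN);  // arithmetic with it stays NaN, flagging the bug
}
```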

December 28, 2022
On Tuesday, 27 December 2022 at 22:54:59 UTC, Walter Bright wrote:
> 
> ...
> D's positive initialization also ensures that instances will not be initialized with random garbage.

And it may not be 'random garbage'. It could be your privateKey.

And you don't want to be sending your uninitialised buffer over the network with your privateKey still in it...

It is very difficult to argue against init-by-default, in any language.

Thankfully D provides an opt-out -> `.. = void` (except for pointers in @safe mode).
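
A minimal sketch of that opt-out (buffer size and the `fill` helper are invented):

```d
void fill(ubyte[] buf) { buf[] = 0xAB; }   // hypothetical producer

void main()
{
    ubyte[64] zeroed;          // default-initialized: all zeros, can't leak anything
    ubyte[64] scratch = void;  // opt-out: holds whatever was in that stack memory
    fill(scratch[]);           // only sound because we overwrite it before use
}
```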

But I would never support default-init for stack/heap allocations in C. How would I then discover your privateKey!?!?
December 28, 2022
On 12/27/22 23:53, Walter Bright wrote:
> This is not a failure, it's a positive feature.

To some extent. Aspects of this have been lovingly nicknamed the "billion dollar mistake".

> It means struct instances will *never* be in a garbage state. 

(Memory corruption.)
December 29, 2022
On 12/28/2022 1:33 AM, Timon Gehr wrote:
> On 12/27/22 23:53, Walter Bright wrote:
>> This is not a failure, it's a positive feature.
> 
> To some extent. Aspects of this have been lovingly nicknamed the "billion dollar mistake".

I don't agree with that assessment at all. Having a seg fault when your program enters an unanticipated, invalid state is a *good* thing. The *actual* billion dollar mistake(s) in C are:

1. uninitialized data leading to undefined behavior

2. no way to do array buffer overflow detection

because those lead to malware and other silent disasters.

And it's good to have a state that a memory object can be initialized to that cannot fail.
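
For illustration, a minimal sketch of point 2 in D, where bounds checking is on by default (the array and index are invented):

```d
void main()
{
    int[] a = [1, 2, 3];
    size_t i = 5;   // imagine this index came from untrusted input
    // With bounds checking on (D's default), the next line throws a
    // RangeError instead of silently reading past the end of the buffer:
    // auto x = a[i];
}
```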

December 29, 2022
On Thursday, 29 December 2022 at 20:38:23 UTC, Walter Bright wrote:
> I don't agree with that assessment at all. Having a seg fault when your program enters an unanticipated, invalid state is a *good* thing. The *actual* billion dollar mistake(s) in C are:

The alternative is that the language could have prevented this state from being unanticipated at all, e.g. nullable vs. non-null types.
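
A minimal sketch of that idea using D's own `std.typecons.Nullable` (the `findUserId` function and its values are invented):

```d
import std.typecons : Nullable, nullable;

// Hypothetical lookup: the absence of a result is part of the type.
Nullable!int findUserId(string name)
{
    if (name == "walter")
        return nullable(42);
    return Nullable!int.init;   // explicitly "no value"
}

void main()
{
    auto id = findUserId("timon");
    if (!id.isNull)
    {
        auto n = id.get;        // payload only reachable after the check
    }
}
```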

December 29, 2022
On 12/29/22 21:38, Walter Bright wrote:
> On 12/28/2022 1:33 AM, Timon Gehr wrote:
>> On 12/27/22 23:53, Walter Bright wrote:
>>> This is not a failure, it's a positive feature.
>>
>> To some extent. Aspects of this have been lovingly nicknamed the "billion dollar mistake".
> 
> I don't agree with that assessment at all. Having a seg fault when your program enters an unanticipated, invalid state

The bad thing is allowing programs to enter unanticipated, invalid states in the first place...


> is a *good* thing. The *actual* billion dollar mistake(s) in C are:
> 
> 1. uninitialized data leading to undefined behavior
> 
> 2. no way to do array buffer overflow detection
> 
> because those lead to malware and other silent disasters.
> ...

Not all disasters are silent. Maybe you are biased because you only write batch programs that are intended to implement a very precise spec.
December 29, 2022
On Thursday, 29 December 2022 at 20:38:23 UTC, Walter Bright wrote:
>
> ..... The *actual* billion dollar mistake(s) in C are:
>
> 1. uninitialized data leading to undefined behavior
>
> 2. no way to do array buffer overflow detection
>
> because those lead to malware and other silent disasters.
>
> And it's good to have a state that a memory object can be initialized to that cannot fail.

I would argue the billion dollar mistakes are really the fault of the users of the C programming language, not the language itself.

Those same users can make billion dollar mistakes in any language. Perhaps not those particular ones you mentioned, but others. Even in the safest language possible, a programmer could leave an API exposed that wasn't meant to be exposed...

The programmer can actually do runtime bounds checking in C. e.g. Create your own vector type with bounds checking.

The programmer can also initialise everything to a known state in C. One could use calloc instead of malloc, or create their own memory allocator.

The C standard library didn't help either. It too could have been designed in a more memory safe manner. But like C itself, it is minimal, performance oriented, and not designed to get in your way or make things difficult for you.

Even if C did all these things for you, and more, it's likely C programmers would have found a way to remove them or turn them off: create their own vector that doesn't do bounds checking, create their own memory allocator that doesn't initialise its allocations...

e.g. `-release -noboundscheck`... sound familiar?
December 29, 2022
On 12/29/2022 12:45 PM, Adam D Ruppe wrote:
> The alternative is that the language could have prevented this state from being unanticipated at all, e.g. nullable vs. non-null types.

It can't really prevent it. What happens is people assign a value, any value, just to get it to compile. I've seen it enough to not encourage that practice.

If there are no null pointers, what designates a leaf node in a tree? An equivalent "null" object is invented. Nothing is really gained.

Null pointers are an excellent debugging tool. When a seg fault happens, it leads directly to the mistake with a backtrace. The "go directly to jail, do not pass go, do not collect $200" nature of what happens is good. *Hiding* those errors happens with non-null pointers.

Initialization with garbage is terrible. I've spent days trying to find the source of those bugs.

Null pointer seg faults are as useful as array bounds overflow exceptions.

NaNs are another excellent tool. They enable, for example, dealing with a data set that may have unknown values in it from bad sensors. Replacing that missing data with "0.0" is a very bad idea.
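
A minimal sketch of that use of NaN (the sensor readings are invented):

```d
import std.algorithm.iteration : filter, sum;
import std.math : isNaN;

void main()
{
    // Invented readings; NaN marks samples lost to a bad sensor.
    double[] samples = [21.5, double.nan, 22.1, double.nan, 21.9];

    auto total = samples.filter!(x => !x.isNaN).sum;  // 65.5, not skewed by fake zeros
}
```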