October 03, 2020
On Saturday, 3 October 2020 at 14:10:34 UTC, Steven Schveighoffer wrote:
> This is a bit ridiculous. If I was trying to sell you a gun, which in 0.001% of cases explodes when you pull the trigger, and I say "Yeah that's true, but that almost never happens. You aren't even trying it!!!" does that make you feel better?

Probability is so difficult to reason about.

Example 1:

It is very improbable that you catch covid-19 if you are careful and conservative in your actions.

It is probable that you catch covid-19 if you regularly interact with other people who are not as careful as you are, or if you simply aren't aware that they were recently on a long vacation trip.

E.g. larger teams, or library authors that might only use one compiler on one platform.

Example 2:

It is probable that you find the source of a bug in a well structured program (no code smell).

It is improbable that a program will remain well structured over time (older programs always smell).

October 03, 2020
On Saturday, 3 October 2020 at 14:10:34 UTC, Steven Schveighoffer wrote:
> What happens is, I write code that assumes some `in` parameter won't change

That code is kinda already buggy since in just means you won't change it, but somebody else might.

I know it is weird when looking at something that is typically a value copy, but in is still based on const, not immutable, so you must keep some expectation that it might be changed by someone else. (BTW speaking of ref optimizations, any immutable could prolly be passed by reference implicitly as well.....)
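
For example, here is a minimal sketch (invented names) of that const-vs-immutable point, using `const ref`, which shows the same change-through-another-reference that `in` would permit wherever it lowers to a reference:

    import std.stdio;

    // `total` is const here, but the caller still holds a mutable reference
    // to the same int, so the value can change while we hold the const view.
    void report(const ref int total, ref int counter)
    {
        writeln("before: ", total);   // prints the current value
        counter += 1;                 // mutates the same object through the alias
        writeln("after:  ", total);   // prints the updated value; const was not broken
    }

    void main()
    {
        int n = 41;
        report(n, n);   // both parameters alias the same variable
    }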

But maybe like you said later, the spec should say any `in` is treated at the language level as a `ref` (except for rvalue issues of course) just optimized to value in defined ABI places. That'd probably be good enough.

October 03, 2020
On 10/3/20 10:10 AM, Steven Schveighoffer wrote:
> Another option is to tag an `in` value as "I'm OK if this is passed by reference". like `in ref` or something (that still binds to rvalues). Then an optimization can say "actually, I'm going to pass this by value", and nobody cares. This is different in that you have to *declare* you're ok with a reference, not the other way around.

Thinking about this more, it seems my main problem is that `in` as it stands, without the switch, is not a reference.

`in ref` is a reference, and it's OK if we make this not a reference in practice, because it's const. And code that takes something via `in ref` can already expect possible changes via other references, but should also be OK if it doesn't change.
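
Concretely, something like this hypothetical pair (not current compiler behaviour; the rvalue binding and the by-value optimization are the proposed parts):

    import std.math : sqrt;

    struct Vec { double x, y, z; }

    // Under the suggestion, `in ref` declares "a reference is fine" (it is
    // const anyway), while the compiler stays free to pass a small struct
    // like this by value if that is cheaper.
    double length(in ref Vec v)
    {
        return sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    }

    void main()
    {
        Vec v = Vec(3, 4, 12);
        assert(length(v) == 13);   // lvalues work today; rvalue binding is the proposed part
    }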

Can we just change -preview=in so it affects `in ref` instead of `in`?

-Steve
October 03, 2020
On Saturday, 3 October 2020 at 14:48:48 UTC, Adam D. Ruppe wrote:
> On Saturday, 3 October 2020 at 14:10:34 UTC, Steven Schveighoffer wrote:
>> What happens is, I write code that assumes some `in` parameter won't change
>
> That code is kinda already buggy since in just means you won't change it, but somebody else might.
>
> I know it is weird when looking at something that is typically a value copy, but in is still based on const, not immutable, so you must keep some expectation that it might be changed by someone else. (BTW speaking of ref optimizations, any immutable could prolly be passed by reference implicitly as well.....)
>
> But maybe like you said later, the spec should say any `in` is treated at the language level as a `ref` (except for rvalue issues of course) just optimized to value in defined ABI places. That'd probably be good enough.

The real problem here is that existing code, which was developed and tested under one set of assumptions, will now have those assumptions silently changed underneath it, in a way that is impossible to detect until and unless it manifests as a bug. (If this sounds familiar to anyone, it may be because it's the same issue that @safe-by-default had with extern(C) functions.)
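
As a hedged illustration of that failure mode (hypothetical code; whether `in` actually lowers to a reference here depends on the type, the ABI and the compiler):

    import std.stdio;

    struct Big { int[64] data; }

    // Written and tested assuming `v` is a private copy.
    int snapshotFirst(in Big v, ref Big target)
    {
        int saved = v.data[0];   // snapshot, relying on `v` being a copy
        target.data[0] = 999;    // an unrelated write, unless `v` aliases `target`
        return saved == v.data[0] ? saved : -1;   // -1 only if `v` changed underneath us
    }

    void main()
    {
        Big b;
        b.data[0] = 7;
        // Today this prints 7; if `in` becomes a reference for a type this
        // large, the aliasing call below could silently start printing -1.
        writeln(snapshotFirst(b, b));
    }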

If the decision is ever made to make `-preview=in` the default, the existing meaning of `in` should first be deprecated, and eventually made into an error.
October 03, 2020
On Saturday, 3 October 2020 at 14:48:48 UTC, Adam D. Ruppe wrote:
> On Saturday, 3 October 2020 at 14:10:34 UTC, Steven Schveighoffer wrote:
>> What happens is, I write code that assumes some `in` parameter won't change
>
> That code is kinda already buggy since in just means you won't change it, but somebody else might.

How come? I thought the current (modern) D semantics are that if you cast away `shared`, the compiler can assume that no other context (threads/IRQs) modifies the object?

October 03, 2020
On Saturday, 3 October 2020 at 14:49:03 UTC, Steven Schveighoffer wrote:
> `in ref` is a reference, and it's OK if we make this not a reference in practice, because it's const. And code that takes something via `in ref` can already expect possible changes via other references, but should also be OK if it doesn't change.

You either support aliasing or not.

If you support aliasing then you should be able to write code where aliasing has the expected outcome.

Let me refer to Ada. According to the Ada manual you can specify that an integer is aliased, which means it is guaranteed to exist in memory (and not in a register). Then you use 'access' to reference it.

If a language construct says "ref", I would expect 100% support for aliasing. It is not as if aliasing is always undesired.
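
In D terms, a minimal sketch (invented names) of the kind of deliberate aliasing through plain `ref` that one expects to just work:

    import std.stdio;

    // `dst` and `src` may legitimately refer to the same variable;
    // plain `ref` is expected to honour that aliasing.
    void accumulate(ref int dst, ref const int src)
    {
        dst += src;   // if dst and src alias, this doubles the value
    }

    void main()
    {
        int x = 21;
        accumulate(x, x);   // intentional aliasing
        writeln(x);         // prints 42
    }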

October 03, 2020
On Saturday, 3 October 2020 at 15:07:05 UTC, Paul Backus wrote:
> If the decision is ever made to make `-preview=in` the default, the existing meaning of `in` should first be deprecated, and eventually made into an error.

That has already been in progress over the last year; that's why `in` is now eligible for modifications.

The meaning has changed twice in the last year. The first change caused non-complying (already broken) code to fail to compile, so a second change came to ease up on it, but that was never going to be a permanent solution. The -preview switch's purpose is to see what final form `in` is going to take.

There will probably be a formal deprecation of it over the coming year before the -preview behaviour is actually solidified. But the changes are already in flux.
October 03, 2020
On Saturday, 3 October 2020 at 14:49:03 UTC, Steven Schveighoffer wrote:
> `in ref` is a reference, and it's OK if we make this not a reference in practice, because it's const. And code that takes something via `in ref` can already expect possible changes via other references, but should also be OK if it doesn't change.

Is that still OK in a concurrent or multithreaded context?

    void foo (in ref int bar)
    {
        // does something which may yield, and
        // another context can change value
        // underlying `bar`
        ...

        // result of this writeln will now depend
        // on how the compiler treated the `in ref`
        writeln(bar);
    }
October 03, 2020
On Saturday, 3 October 2020 at 15:21:47 UTC, Ola Fosheim Grøstad wrote:
> How come? I thought the current (modern) D semantics is that if you cast away shared the compiler can assume that no other contexts (threads/IRQs) modifies the object?

I don't know the rules for shared... I don't think anyone does.

But the rule for const vs immutable is well known. Passing the same thing const on one side and mutable on the other doesn't break const, even though it changes - that's exactly expected.
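
A small sketch of that rule (hypothetical variables): data seen through a `const` view may still change through a mutable alias, while `immutable` data never can:

    void main()
    {
        int m = 1;
        const(int)* c = &m;          // a const view of mutable data is allowed
        m = 2;                       // *c now reads 2; const was never violated
        assert(*c == 2);

        immutable int n = 3;
        immutable(int)* i = &n;      // fine: the pointee can never change
        // immutable(int)* bad = &m; // error: mutable data can't be viewed as immutable
        assert(*i == 3);
    }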
October 03, 2020
On Saturday, 3 October 2020 at 00:35:24 UTC, Guillaume Piolat wrote:
> On Friday, 2 October 2020 at 23:03:49 UTC, Ola Fosheim Grøstad wrote:
>>
>> I have no issues with undefined behaviour as long as it is easy to understand and explain, like requiring 'in' params to be nonaliased in the function body. It has to be easy to grok and remember.
>
> Or the even simpler rule: don't use 'in' :)

Actually, if "in" implies non-aliased you might want to use it for DSP buffers because modern backend can then more easily generate SIMD code when the buffer is non-aliased. :-)