October 05, 2020
On Monday, 5 October 2020 at 02:46:42 UTC, Andrei Alexandrescu wrote:
> On 10/4/20 10:19 AM, Iain Buclaw wrote:
>> On Saturday, 3 October 2020 at 05:02:36 UTC, Andrei Alexandrescu wrote:
>>> Who approved this? How in the world did a group of competent,
>>> well-intended people look at this and say "yep, good idea. Let's"?
>>>
>>> ???
>>>
>> 
>> *You* approved it.
>> 
>> https://github.com/dlang/dmd/pull/11000#issuecomment-675605193
>
> Oi. Touché.
>

I know the feeling all too well.  I have on a few occasions run into issues in the D front-end (from GDC) and asked "who on Earth wrote or approved this?", only to discover that it was me, two months ago. :-)

>>> Please, we really need to put the toothpaste back in the tube here. I call on everybody's clear head here to reconsider this.
>> 
>> Frankly, I think you are making a mountain out of a molehill here.  You are imagining a problem that doesn't exist; and if one does find an issue, the fault lies with the DMD compiler and not the D language specification.  Though evidently having clearer wording in the spec benefits all.
>
> I think my STL examples have put the narrative that confusing aliasing is rare to rest.
>

The spec as currently written does seem to be wide open to interpretation.  But I think that trying to reason about 'in' in terms of 'ref' and 'restrict' semantics should be left at the door.

Any thought problems that arise from aliasing are hard to justify in my view, because in practice I just can't see D having strict aliasing rules so long as they continue to go unenforced.

Actually, I think there is zero mention of aliasing in the language spec, so the following can only be interpreted as valid and precisely defined to work in D (on a little-endian target):
---
float f = 1.0;               // bit pattern 0x3F800000
bool* bptr = cast(bool*)&f;  // reinterpret the float's bytes
bptr[2] = false;             // clear the third byte, leaving 0x3F000000
assert(f == 0.5);            // 0x3F000000 is the bit pattern of 0.5f
---
If this gets addressed, then we can use aliasing rules as a measure for how we treat -preview=in.  If you are interested in defining some aliasing rules for D, I'd be more than happy to spin off a new thread to discuss them, and I will implement that in GDC and report back the success/failures of applying such rules. :-)

>> If you read nothing more of this reply, at least finish up until the end of this paragraph.  Please hold fire until GDC and LDC have implemented this feature, then we can discuss the pitfalls that we've encountered with it.  Basing decisions on behaviors observed with DMD is not the right approach, and if you are currently finding the situation to be a mess, it is a mess of DMD's own doing.
>
> Implementation details of dmd are not of concern here, and in fact the more different ldc/gdc/dmd are from one another, the more problematic the entire matter is.
>

As this is an experimental feature, a bit of deviation in implementations can be seen as a good thing.  Convergence can come later once we work out just who has got it right.

Correct me if I'm wrong, but it looks like we'll have three competing implementations:

DMD: `const scope`, with `ref` applied on types usually passed in memory.
LDC: `const scope ref @restrict`
GDC: `const scope` with `ref` applied on types usually passed by invisible reference.
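
The divergence between those three is directly observable in code. A minimal sketch, assuming `-preview=in` is enabled (the names here are illustrative, not taken from any compiler's test suite):

```d
// Whether the write through the global is visible inside `f` depends
// on whether the backend chose to pass `s` by reference.
struct S { int[8] data; }

S g;  // module-level instance that the caller will also pass as `in`

int f(in S s)
{
    g.data[0] = 42;    // mutates memory that may alias `s`
    return s.data[0];  // 42 if `s` was passed by ref, 0 if it was copied
}

void main()
{
    int r = f(g);
    // `r` is implementation defined: by-value gives 0, by-ref gives 42,
    // so the three implementations above can legitimately disagree.
    assert(r == 0 || r == 42);
}
```

Under the DMD and GDC rules the answer additionally depends on the size and ABI classification of `S`, while LDC's `@restrict` interpretation would arguably make the aliasing itself invalid.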
October 05, 2020
On Monday, 5 October 2020 at 07:55:57 UTC, Walter Bright wrote:
> On 10/4/2020 7:19 AM, Iain Buclaw wrote:
>> I've also skimmed past a passing concern that the ABI between compilers would be different.  Well, let me reassure you that DMD, GDC and LDC have never been compatible in the first place, so there's no point worrying about that now.
>
> The problem is not dmd compiled code calling gdc/ldc compiled code. The problem is code that works with one compiler fails with another, because the "to ref or not to ref" decision is *implementation defined*.

Granted that this is a new preview feature, I can only see it as healthy if each vendor tries something different and reports back on how much success they had with it.

So let other vendors implement as they interpret the spec, and we can converge later based on success/failings of our given decisions.  I imagine we are 10 releases of DMD away from even considering whether or not this should come out of `-preview`.
October 05, 2020
On Monday, 5 October 2020 at 07:55:57 UTC, Walter Bright wrote:
> On 10/4/2020 7:19 AM, Iain Buclaw wrote:
>> I've also skimmed past a passing concern that the ABI between compilers would be different.  Well, let me reassure you that DMD, GDC and LDC have never been compatible in the first place, so there's no point worrying about that now.
>
> The problem is not dmd compiled code calling gdc/ldc compiled code. The problem is code that works with one compiler fails with another, because the "to ref or not to ref" decision is *implementation defined*.

Furthermore, NRVO is implementation defined, which results in dmd/gdc/ldc making different decisions for "to ref or not to ref", and yet I see no one complaining about that.
October 05, 2020
On Monday, 5 October 2020 at 08:34:37 UTC, Iain Buclaw wrote:
> On Monday, 5 October 2020 at 07:55:57 UTC, Walter Bright wrote:
>> On 10/4/2020 7:19 AM, Iain Buclaw wrote:
>>> I've also skimmed past a passing concern that the ABI between compilers would be different.  Well, let me reassure you that DMD, GDC and LDC have never been compatible in the first place, so there's no point worrying about that now.
>>
>> The problem is not dmd compiled code calling gdc/ldc compiled code. The problem is code that works with one compiler fails with another, because the "to ref or not to ref" decision is *implementation defined*.
>
> Furthermore, NRVO is implementation defined, which results in dmd/gdc/ldc making different decisions for "to ref or not to ref", and yet I see no one complaining about that.

I cannot resist the bait: https://issues.dlang.org/show_bug.cgi?id=20752
October 05, 2020
On 10/4/2020 11:37 PM, Iain Buclaw wrote:
> I don't think __restrict__ is a good way to reason with expected behaviour.  The spec only makes three things clear: No clobber; No escape; Copy elision if possible.
> 
> In is not ref, and is not restrict.  In is in - a value goes in and doesn't come out.

https://dlang.org/spec/function.html#parameters says:

"in The parameter is an input to the function. Input parameters behaves as if they have the const scope storage classes. Input parameters may be passed by reference by the compiler. Unlike ref parameters, in parameters can bind to both lvalues and rvalues (such as literals). Types that would trigger a side effect if passed by value (such as types with postblit, copy constructor, or destructor), and types which cannot be copied, e.g. if their copy constructor is marked as @disable, will always be passed by reference. Dynamic arrays, classes, associative arrays, function pointers, and delegates will always be passed by value, to allow for covariance. If the type of the parameter does not fall in one of those categories, whether or not it is passed by reference is implementation defined, and the backend is free to choose the method that will best fit the ABI of the platform."

The salient points are "may be passed by reference", and "whether or not it is passed by reference is implementation defined". The trouble with passing by reference is when there are other live mutable references to the same memory object. Whether mutating through those references mutates the in argument is "implementation defined".

That's the problem with `in`. It's not an issue of a shortcoming in DMD.
October 05, 2020
On 10/5/2020 1:34 AM, Iain Buclaw wrote:
> Furthermore, NRVO is implementation defined, which results in dmd/gdc/ldc making different decisions for "to ref or not to ref", and yet I see no one complaining about that.

I don't see that as a "to ref or not" decision. It is an issue of how many copies are made; there shouldn't be dangling references to those copies, or the compiler shouldn't apply NRVO at all.

I don't recall any case of NRVO breaking code since I invented it 30 years ago, other than something that relied on the number of copies.
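
For what it's worth, the "number of copies" observation is easy to make concrete. A minimal sketch in plain D, with nothing compiler-specific assumed:

```d
// A postblit counter makes NRVO's implementation-defined nature visible:
// code that relies on the exact copy count is the only code that can break.
struct S
{
    int[16] payload;
    static int copies;        // incremented on every copy of an S
    this(this) { ++copies; }  // postblit runs once per copy
}

S make()
{
    S s;
    s.payload[0] = 1;
    return s;  // with NRVO, `s` is constructed directly in the caller's slot
}

void main()
{
    auto s = make();
    // S.copies is 0 if NRVO fired, 1 or more otherwise; either way no
    // reference to an elided temporary survives, so code rarely notices.
    assert(s.payload[0] == 1);
}
```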
October 05, 2020
On 10/5/2020 1:42 AM, Mathias LANG wrote:
> I cannot resist the bait: https://issues.dlang.org/show_bug.cgi?id=20752

And you shouldn't. If there's a memory corruption issue in the language, file a bug report.
October 05, 2020
On 05.10.20 09:56, Iain Buclaw wrote:
> 
> 
> Correct me if I'm wrong, but it looks like we'll have three competing implementations:
> 
> DMD: `const scope`, with `ref` applied on types usually passed in memory.
> LDC: `const scope ref @restrict`
> GDC: `const scope` with `ref` applied on types usually passed by invisible reference.

Weren't there different rules for non-POD types? Do those differ between backends?

Also, it looks like in this case `__traits(isRef, ...)` will return different results with different compiler backends and aliasing `in` parameters is UB with LDC? Is `@restrict` wrongly treated as `@safe` or does the `@safe`ty of `in` parameters depend on the compiler backend?
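
The `__traits(isRef, ...)` part of the question can at least be probed directly. A hypothetical sketch, assuming `-preview=in` and that `isRef` reflects the backend's choice for `in` parameters (which is exactly what is being asked, so treat the output as a per-compiler data point rather than a guarantee):

```d
// Print, at compile time, which `in` parameters this compiler decided
// to pass by reference. The answers may differ between dmd/gdc/ldc.
void probe(in int[64] big, in int small)
{
    pragma(msg, "big passed by ref:   ", __traits(isRef, big));
    pragma(msg, "small passed by ref: ", __traits(isRef, small));
}
```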
October 05, 2020
On Monday, 5 October 2020 at 00:31:33 UTC, kinke wrote:
> On Sunday, 4 October 2020 at 23:40:29 UTC, Ola Fosheim Grøstad wrote:
>> It might help some if compilers would run unit tests 3 times with different 'in' implementations.
>>
>> 1 mixed value/ref
>> 2 value
>> 3 ref
>
> I was about to propose something like this (restricted to POD types), possibly augmented by some nondeterministic fuzzing. Code ported to the new `in` semantics could then even show some previously unintended aliasing issues; getting to the root of the problem would probably still be non-trivial though.
>
> A compiler mode enforcing by-ref, coupled with some sort of runtime sanitizer detecting invalid writes to live in-params, would probably be a very valuable tool for validation & troubleshooting.

You could do this: make pass-by-value the default, then add compiler switches with increasingly aggressive optimizations, up to assuming no aliasing.

That ought to be innocent enough.
October 05, 2020
On Monday, 5 October 2020 at 07:55:57 UTC, Walter Bright wrote:
>
> The problem is not dmd compiled code calling gdc/ldc compiled code. The problem is code that works with one compiler fails with another, because the "to ref or not to ref" decision is *implementation defined*.

Ouch, so this means that how 'in' works must be a defined standard that all D compilers adhere to. And this standard must be defined for each CPU architecture.