August 20, 2020
On Friday, 31 July 2020 at 21:49:25 UTC, Mathias LANG wrote:
> https://github.com/dlang/dmd/pull/11000

1. Deprecation of `in ref` makes no sense. Why? I assume it's due to a bug in the proposed change. How should the compiler know that the argument should be passed by ref? It doesn't necessarily know how to load the argument; it may have alignment and synchronization requirements. And more importantly, how can the programmer know whether the argument is passed by ref, now that it varies by platform?
2. Dependence on calling convention. As I understand it, ref semantics depend on parameter position?
3. Runtime hooks don't go through semantic checks. Is this a theoretical concern, or did you introduce some new behavior that causes problems with this?
August 20, 2020
On Thursday, 20 August 2020 at 15:59:24 UTC, Kagamin wrote:
> On Friday, 31 July 2020 at 21:49:25 UTC, Mathias LANG wrote:
>> https://github.com/dlang/dmd/pull/11000
>
> 1. Deprecation of `in ref` makes no sense. Why? I assume it's due to a bug in the proposed change.

I'm not in the business of deprecating something to accommodate a broken implementation, no. The implementation originally allowed `in ref`, but after some tinkering and looking at people's usages, my opinion is that it would be better to just allow `in`.

> How should the compiler know that the argument should be passed by ref?
> It doesn't necessarily know how to load the argument; it may have alignment and synchronization requirements.

As explained in the PR, and in the changelog, the compiler knows by inspecting the type. If the type has an elaborate copy or destruction, IOW, if copying it would have side effects, it will always be passed by ref to avoid those side effects.
Otherwise, it asks the backend. The current rule in DMD is for types that are over twice the size of a register (or of `real`) to be passed by ref.

I can't think of a situation where the compiler doesn't know how to load the argument. If you're talking about opaque types, those are rejected.
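As a rough illustration of that rule (the struct names and sizes here are made up, and the by-ref vs. by-value outcome assumes a compiler built with `-preview=in`):

```d
import std.stdio;

struct Big  { ubyte[64] data; }  // far larger than two registers: likely passed by ref
struct Tiny { int x; }           // fits in a register: passed by value

// Under -preview=in, `in` means `scope const`; the compiler picks
// by-ref or by-value per the rules above, and the caller's code
// doesn't change either way.
int first(in Big b)  { return b.data[0]; }
int twice(in Tiny t) { return 2 * t.x; }

void main()
{
    Big b;
    b.data[0] = 21;
    writeln(first(b));        // no 64-byte copy needed under -preview=in
    writeln(twice(Tiny(21)));
}
```

Note that neither function body changes depending on the passing convention; only the generated call does.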

> And more importantly, how can the programmer know whether the argument is passed by ref, now that it varies by platform?

I don't understand why it would be "more important". The point of an `in` parameter is that it does the right thing for parameters which are read-only and won't escape the scope of the function. It doesn't matter to the user whether the parameter is `ref` or not if it is `scope const`, because you can't modify it anyway. It only matters if passing it by value would be expensive (e.g. a large static array) or have side effects (e.g. a destructor).
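A minimal sketch of that contract (the struct and function names are made up):

```d
struct Matrix { double[16] m; }

// `in` documents the contract: read-only, doesn't escape the function.
// Whether `mat` travels by ref or by value is the compiler's business.
double trace(in Matrix mat)
{
    // mat.m[0] = 1.0;  // error: cannot modify `const` expression
    return mat.m[0] + mat.m[5] + mat.m[10] + mat.m[15];
}

void main()
{
    Matrix id;
    id.m[0] = id.m[5] = id.m[10] = id.m[15] = 1.0;
    assert(trace(id) == 4.0);
}
```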


> 2. Dependence on calling convention. AIU ref semantics depends on parameter position?

Yes. Originally it didn't, but that was the main feedback I got: that it should be done at the function level instead of the parameter (type) level.

> 3. Runtime hooks don't go through semantic checks. Is this a theoretical concern or did you introduce some new behavior that causes problem with this?

Just to be clear: when I said "runtime hook", I meant "the AST which the compiler generates to call C functions in druntime". It generates the equivalent of a prototype and calls that. It's not a big deal, and I found a way around it.
August 20, 2020
On Friday, 31 July 2020 at 21:49:25 UTC, Mathias LANG wrote:
>
> B) It makes `in` take the effect of `ref` when it makes sense. It always pass something by `ref` if the type has elaborate construction / destruction (postblit, copy constructor, destructors). If the type doesn't have any of those it is only passed by `ref` if it cannot be passed in register. Some types (dynamic arrays, probably AA in the future) are not affected to allow for covariance (more on that later). The heuristics there still need some small improvements, e.g. w.r.t. floating points (currently the heuristic is based on size, and not asking the backend) and small struct slicing, but that should not affect correctness.
>

This is interesting on a general level as well, and true for several programming languages: let the compiler optimize the parameter passing unless the programmer explicitly asks for a certain way (copy object, pointer/reference, etc.). This is very unusual; if you know of a language that optimizes parameter passing by default, please mention it, because it would be interesting.

I'm all for `in` being used as an "optimized const parameter" where the compiler optimizes as it sees fit. However, the question is whether we want to override the default behaviour on a per-type basis. As mentioned, you might want another default behaviour for arrays, but that can be true for other storage structures as well, so this should really be user defined.

August 20, 2020
On Thursday, 20 August 2020 at 17:31:17 UTC, IGotD- wrote:
> This is interesting on a general level as well, and true for several programming languages: let the compiler optimize the parameter passing unless the programmer explicitly asks for a certain way (copy object, pointer/reference, etc.). This is very unusual; if you know of a language that optimizes parameter passing by default, please mention it, because it would be interesting.
>

Nim does this, and it took the feature from Ada. You can override the behavior with pragmas, but I've only seen that done for C interop, not for optimizations, as the compiler always seems to get it right.
August 20, 2020
On 8/20/20 1:31 PM, IGotD- wrote:
> This is interesting on a general level as well, and true for several programming languages: let the compiler optimize the parameter passing unless the programmer explicitly asks for a certain way (copy object, pointer/reference, etc.).

This has been discussed a few times. If mutation is allowed, aliasing is a killer:

void fun(ref S a, const compiler_chooses S b) {
    ... mutate a, read b ...
}

S x;
fun(x, x); // oops

The problem now is that the semantics of fun depend on whether the compiler chose pass by value vs. pass by reference.
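The hazard can be made concrete by spelling out both choices the compiler might make (a sketch; `funByValue` and `funByRef` are made-up names standing in for the two possible lowerings of `fun`):

```d
import std.stdio;

struct S { int v; }

// If the compiler picks by-value for `b`, fun sees the original
// value of x even after mutating a.
int funByValue(ref S a, S b)     { a.v = 1; return b.v; }

// If the compiler picks by-reference, mutating a changes what b
// observes when both arguments alias the same object.
int funByRef(ref S a, ref S b)   { a.v = 1; return b.v; }

void main()
{
    S x = S(41);
    writeln(funByValue(x, x)); // 41: b captured a copy before the mutation
    x = S(41);
    writeln(funByRef(x, x));   // 1: a and b alias the same object
}
```

Same call site, two different results, which is exactly why letting the compiler choose silently is dangerous once mutation through another parameter is possible.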
August 21, 2020
On Thursday, 20 August 2020 at 22:19:16 UTC, Andrei Alexandrescu wrote:
> On 8/20/20 1:31 PM, IGotD- wrote:
>> This is interesting on a general level as well, and true for several programming languages: let the compiler optimize the parameter passing unless the programmer explicitly asks for a certain way (copy object, pointer/reference, etc.).
>
> This has been discussed a few times. If mutation is allowed, aliasing is a killer:
>
> void fun(ref S a, const compiler_chooses S b) {
>     ... mutate a, read b ...
> }
>
> S x;
> fun(x, x); // oops
>
> The problem now is that the semantics of fun depends on whether the compiler chose pass by value vs. pass by reference.

True, but in practice it doesn't happen very often. The benefits far outweigh this minor downside. Plus, there are known ways to prevent this form of aliasing at compile time.
August 21, 2020
On Thursday, 20 August 2020 at 22:19:16 UTC, Andrei Alexandrescu wrote:
> On 8/20/20 1:31 PM, IGotD- wrote:
>> This is interesting on a general level as well, and true for several programming languages: let the compiler optimize the parameter passing unless the programmer explicitly asks for a certain way (copy object, pointer/reference, etc.).
>
> This has been discussed a few times. If mutation is allowed, aliasing is a killer:
>
> void fun(ref S a, const compiler_chooses S b) {
>     ... mutate a, read b ...
> }
>
> S x;
> fun(x, x); // oops
>
> The problem now is that the semantics of fun depends on whether the compiler chose pass by value vs. pass by reference.

Isn't that what Walter's OB system is supposed to address?
August 21, 2020
On Friday, 31 July 2020 at 21:49:25 UTC, Mathias LANG wrote:
> [...]

How does the ABI work for this with extern(C) and others? Will it mean ref or not ref, i.e. will that translate to a pointer or not?

For example, in Win32 COM IDL files I often see [in] used for input-only parameters. Can we annotate parameters in D like that, without changing calling semantics, to document that they are input-only parameters and potentially allow optimizations?
August 21, 2020
On Thursday, 20 August 2020 at 17:25:43 UTC, Mathias LANG wrote:
> On Thursday, 20 August 2020 at 15:59:24 UTC, Kagamin wrote:
>> On Friday, 31 July 2020 at 21:49:25 UTC, Mathias LANG wrote:
>>> https://github.com/dlang/dmd/pull/11000
>>
>> 1. Deprecation of `in ref` makes no sense. Why? I assume it's due to a bug in the proposed change.
>
> I'm not in the business of deprecating something to accommodate a broken implementation, no. The implementation originally allowed `in ref`, but after some tinkering and looking at people's usages, my opinion is that it would be better to just allow `in`.

It needlessly degrades the language and breaks code, and shouldn't be done. Didn't you write this pull request because you believe `in ref` is useful?

> I can't think of a situation where the compiler doesn't know how to load the argument. If you're talking about opaque types, those are rejected.
>
>> And more importantly, how can the programmer know whether the argument is passed by ref, now that it varies by platform?
>
> I don't understand why it would be "more important". The point of `in` parameter is that it does the right thing for parameters which are read-only and won't escape the scope of the function. It doesn't matter to the user whether your parameter is `ref` or not if it is `scope const`, because you can't modify it anyway. It only matters if passing it by value would be expensive (e.g. large static array) or have side effects (e.g. a destructor).

I mean things like:

int atomicLoad(in ref shared int n);
int loadAligned(in ref byte[4] n);

Here the argument should be passed by ref by the programmer's intent, and that intent should be communicated to the compiler, because the compiler isn't that smart.

>> 2. Dependence on calling convention. AIU ref semantics depends on parameter position?
>
> Yes. Originally didn't, but that was the main feedback I got, that it should be done at the function level instead of the parameter (type) level.

Doesn't this defeat your optimization when passing by value is expensive?
August 21, 2020
On Friday, 21 August 2020 at 09:37:37 UTC, WebFreak001 wrote:

> For example in Win32 COM IDL files I often see [in] used for input only parameters, can we annotate parameters in D like that without changing calling semantics to document that they are input only parameters and potentially allow optimizations?

You can attach a UDA. It can't be called `in`, but perhaps `input`.
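A sketch of what that could look like (the `input` tag here is hypothetical, purely documentation with no effect on calling semantics):

```d
enum input;  // empty UDA tag, carries no behavior

// `src` is documented as input-only; the ABI is unchanged.
void copyData(@input const(ubyte)[] src, ubyte[] dst)
{
    dst[] = src[];
}

void main()
{
    ubyte[4] a = [1, 2, 3, 4];
    ubyte[4] b;
    copyData(a[], b[]);
    ubyte[4] expected = [1, 2, 3, 4];
    assert(b == expected);
}
```

Tools or reviewers could then look for `@input` via introspection, much like COM IDL's [in], without touching the calling convention.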

--
/Jacob Carlborg