June 22, 2020
On 6/22/20 10:33 AM, Steven Schveighoffer wrote:
> On 6/22/20 10:20 AM, Paul Backus wrote:
>> On Monday, 22 June 2020 at 12:15:39 UTC, Steven Schveighoffer wrote:
>>>
>>> My question wasn't about how such a thing could be implemented, but how it works with const ranges.
>>>
>>> I think foreach(x; someConstRange) wouldn't be possible; you'd have to recurse:
>>>
>>> void process(const Range r)
>>> {
>>>    subProcess(r.front);
>>>    process(r.rest);
>>> }
>>>
>>> The point is to question the statement "so that we can have `const` and `immutable` ranges".
>>>
>>> Sure, we could implement recursive versions of find, etc. I don't know if that's worth it.
>>>
>>
>> Well, currently, range algorithms can't work with const ranges *at all*, recursively or iteratively. So from a user perspective, this would be a strict improvement on the status quo.
> 
> Algorithms can work with const ranges -- as long as the range is an array:
> 
> const int[] arr = [1, 2, 3, 4, 5];
> 
> auto x = arr.find(3);
> 
> assert(x == [3, 4, 5]);
> 
> I think the better option is to focus on making it possible to replicate this behavior for generic ranges, rather than implementing a new and awkward API.
> 
> FeepingCreature is right: we should try to create a head-mutable mechanism. I have envisioned it in the past as a tail-modifier mechanism (e.g. tail-const).

Oh, indeed. Arrays are special cased by the compiler - const(T[]) is silently converted to const(T)[] when calling a function.
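
For illustration, a minimal sketch of that special case (the callee name is made up):

    void takesTail(const(int)[] r) {} // arbitrary callee taking a tail-const slice

    void main()
    {
        const int[] arr = [1, 2, 3];
        // const(int[]) is silently passed as const(int)[]: the slice itself
        // is a fresh copy that could be rebound, the elements stay const.
        takesTail(arr);
    }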

At one point there was a discussion about allowing a similar conversion to be done automatically by the compiler - opCall(). So whenever you pass an object to a function, if the type defines opCall, it would be automatically invoked.
June 22, 2020
On 6/22/20 11:07 AM, jmh530 wrote:
> On Monday, 22 June 2020 at 14:33:53 UTC, Steven Schveighoffer wrote:
>> [snip]
>>
>> FeepingCreature is right: we should try to create a head-mutable mechanism. I have envisioned it in the past as a tail-modifier mechanism (e.g. tail-const).
>>
>> -Steve
> 
> I just read a thread from 2010 on tail const/head mutable. Kind of amazing that it is a difficult enough problem that it still hasn't been resolved.

I don't think it's that difficult; it's that it has a rather large footprint. Qualifiers weigh heavily on the language.

I think the lowering route with opCall() would be much easier, and also more expressive.
June 22, 2020
On Mon, Jun 22, 2020 at 11:52:47AM -0400, Andrei Alexandrescu via Digitalmars-d wrote: [...]
> At one point there was a discussion about allowing a similar conversion to be done automatically by the compiler - opCall(). So whenever you pass an object to a function, if the type defines opCall, it would be automatically invoked.

This conflicts with the function call operator, which is also called .opCall.  I wouldn't want the function objects I pass around to be "accidentally" invoked just because they define .opCall!


T

-- 
In a world without fences, who needs Windows and Gates? -- Christian Surchi
June 22, 2020
On 6/21/20 1:43 PM, Stanislav Blinov wrote:
> Input ranges, by nature being one-pass, *should not be copyable*. You can't do anything (good) with a copy, and have to invest in implementing a copy that won't bite. If you're giving such a range away - you're giving it *away*, to someone else to consume. It being copyable only means that you're leaving for yourself a mutable reference to state that you shouldn't touch again. When you need the remainder back - your callee will move it back.

Good arguments, no doubt, but long experience with noncopyable C++ objects suggests that defining such types needs to be approached with trepidation, as they are very cumbersome to use. Any type containing a noncopyable type as a member becomes itself noncopyable; this is not something that we can inflict lightly on the casual users of input ranges. (To wit: no input streams or iterators in C++ are noncopyable, although the same argument would apply to them as well.)
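
To make the viral effect concrete, a minimal D sketch (type names made up):

    struct OneShot
    {
        @disable this(this); // a noncopyable input range
        int front;
    }

    struct Wrapper
    {
        OneShot source; // Wrapper silently becomes noncopyable too
    }

    void main()
    {
        Wrapper a;
        // Wrapper b = a; // error: Wrapper is not copyable because of source
    }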

Also, it is not unheard of to have two input ranges fed from the same source, with the obvious semantics (whoever calls functions to get more data will get the data and push the cursor further). True, buffering is an issue (what if two copies each have their own buffer?), but that's an engineering problem, not one of principle.

It is quite clear to me that we can't propose a design with noncopyable input ranges without effectively making them pariahs that everybody will find painful to use and will do their best to avoid.
June 22, 2020
On Monday, 22 June 2020 at 15:47:51 UTC, Steven Schveighoffer wrote:
> On 6/22/20 10:55 AM, Paul Backus wrote:
>> 
>> This isn't really "algorithms working with const ranges"; rather, it's "const(T[]) implicitly converts to const(T)[]". The algorithm itself never sees a const range.
>
> I don't see a difference. When you copy a range as a parameter, the head is a different piece of memory. This is why it works. Why is it important how it works?

It's important how it works because it *doesn't* work, for ranges in general. It only works for slices.

Making it work in general requires either subverting the type system (like Rebindable does) or adding more special cases to the language.

> Perhaps you are thinking of how templates automatically strip the head const? That I don't necessarily agree with either, but there isn't a good way to say "take this parameter as head-mutable" in generic code, which is why *that* was added. If we did have a mechanism to say that (such as FeepingCreature's `headmut` example), then we wouldn't need that special treatment.

The question is, why do you need a mechanism to say that in the first place? Why not just have generic code take the argument as its actual type?

Answer: because popFront requires a mutable object.

`headmut` is a solution to a problem that doesn't need to exist in the first place. Get rid of popFront, and you get rid of the problem.
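
A minimal sketch of what that could look like (the tail() name and shape are hypothetical, not an existing Phobos API):

    struct Iota
    {
        int lo, hi;
        bool empty() const { return lo >= hi; }
        int front() const { return lo; }
        Iota tail() const { return Iota(lo + 1, hi); } // "the rest" is a new value
    }

    // No mutation anywhere, so a const range is not a special case.
    int sum(const Iota r)
    {
        return r.empty ? 0 : r.front + sum(r.tail);
    }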
June 22, 2020
On 6/22/20 12:01 PM, H. S. Teoh wrote:
> On Mon, Jun 22, 2020 at 11:52:47AM -0400, Andrei Alexandrescu via Digitalmars-d wrote:
> [...]
>> At one point there was a discussion about allowing a similar conversion
>> to be done automatically by the compiler - opCall(). So whenever you
>> pass an object to a function, if the type defines opCall, it would be
>> automatically invoked.
> 
> This conflicts with the function call operator, which is also called
> .opCall.  I wouldn't want the function objects I pass around to be
> "accidentally" invoked just because it defines .opCall!

Oh, sorry. The name was different - possibly opOnCall.

All in all, it's a matter of deciding how important this problem is. (I think it is.)

June 22, 2020
On 6/22/20 12:14 PM, Paul Backus wrote:
> Making it work in general requires either subverting the type system (like Rebindable does) or adding more special cases to the language.

The principled approach is to generalize the trick currently used only for arrays. Consider the lowering:

"Whenever something is passed to a function, .opOnCall is appended."

Then:

auto ref T opOnCall(T)(auto ref T x) { return x; }
const(T)[] opOnCall(T)(const(T[]) x) { return x; }
immutable(T)[] opOnCall(T)(immutable(T[]) x) { return x; }

This is just a sketch - I haven't thought it through - but if properly defined, the scheme would keep built-in arrays working as they do today, and would also allow user-defined types to define their own opOnCall as they see fit.
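
For instance, a user-defined range could (hypothetically, under such a lowering) provide something like:

    struct MyRange(T)
    {
        T[] data;

        // Hand the callee a fresh, head-mutable wrapper; only the elements
        // stay const. The member-function form here is speculative.
        MyRange!(const T) opOnCall() const
        {
            return MyRange!(const T)(data);
        }
    }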
June 22, 2020
On Monday, 22 June 2020 at 16:42:55 UTC, Andrei Alexandrescu wrote:
> On 6/22/20 12:14 PM, Paul Backus wrote:
>> Making it work in general requires either subverting the type system (like Rebindable does) or adding more special cases to the language.
>
> The principled approach is to generalize the trick currently used only for arrays. Consider the lowering:
>
> "Whenever something is passed to a function, .opOnCall is appended."
>
> Then:
>
> auto ref T opOnCall(T)(auto ref T x) { return x; }
> const(T)[] opOnCall(T)(const(T[]) x) { return x; }
> immutable(T)[] opOnCall(T)(immutable(T[]) x) { return x; }
>
> This is just a sketch - I haven't thought it through - but if properly defined, the scheme would keep built-in arrays working as they do today, and would also allow user-defined types to define their own opOnCall as they see fit.

The trick used for arrays does not only apply to function calls:

    const(int[]) a = [1, 2, 3];
    const(int)[] b = a; // compiles

IMHO the principled way to allow user-defined implicit conversions is...to allow user-defined implicit conversions. But iirc that's a can of worms Walter prefers not to open.
June 22, 2020
On 6/22/20 12:50 PM, Paul Backus wrote:
> On Monday, 22 June 2020 at 16:42:55 UTC, Andrei Alexandrescu wrote:
>> On 6/22/20 12:14 PM, Paul Backus wrote:
>>> Making it work in general requires either subverting the type system (like Rebindable does) or adding more special cases to the language.
>>
>> The principled approach is to generalize the trick currently used only for arrays. Consider the lowering:
>>
>> "Whenever something is passed to a function, .opOnCall is appended."
>>
>> Then:
>>
>> auto ref T opOnCall(T)(auto ref T x) { return x; }
>> const(T)[] opOnCall(T)(const(T[]) x) { return x; }
>> immutable(T)[] opOnCall(T)(immutable(T[]) x) { return x; }
>>
>> This is just a sketch - I haven't thought it through - but if properly defined, the scheme would keep built-in arrays working as they do today, and would also allow user-defined types to define their own opOnCall as they see fit.
> 
> The trick used for arrays does not only apply to function calls:
> 
>      const(int[]) a = [1, 2, 3];
>      const(int)[] b = a; // compiles

That's different - it's an implicit conversion.

> IMHO the principled way to allow user-defined implicit conversions is...to allow user-defined implicit conversions. But iirc that's a can of worms Walter prefers not to open.

What happens upon function calls is not an implicit conversion. It's a forced type change.
June 22, 2020
On Monday, 22 June 2020 at 21:46:51 UTC, Andrei Alexandrescu wrote:
> On 6/22/20 12:50 PM, Paul Backus wrote:
>> 
>> The trick used for arrays does not only apply to function calls:
>> 
>>      const(int[]) a = [1, 2, 3];
>>      const(int)[] b = a; // compiles
>
> That's different - it's an implicit conversion.
>
>> IMHO the principled way to allow user-defined implicit conversions is...to allow user-defined implicit conversions. But iirc that's a can of worms Walter prefers not to open.
>
> What happens upon function calls is not an implicit conversion. It's a forced type change.

So what you're saying is, it's even *less* principled than I thought? :)

Regardless, my broader point is that once we're open to the possibility of designing a new range API, all of this can be solved without any language changes by using an API that doesn't require mutation (i.e., tail()). Surely that's a better solution than implicitly inserting calls to arbitrary user-defined code every time someone passes an argument to a function.
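
A sketch of what such a mutation-free API enables (tail() here is hypothetical, not an existing primitive):

    // A tail-based view over const data; every member is const.
    struct View
    {
        const(int)[] data;
        bool empty() const { return data.length == 0; }
        int front() const { return data[0]; }
        View tail() const { return View(data[1 .. $]); }
    }

    // find() written against empty/front/tail never mutates its argument,
    // so taking it by const works out of the box.
    View find(const View haystack, int needle)
    {
        return haystack.empty || haystack.front == needle
            ? View(haystack.data)
            : find(haystack.tail, needle);
    }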