May 09, 2013
On 10 May 2013 08:50, Timon Gehr <timon.gehr@gmx.ch> wrote:

> On 05/09/2013 11:35 PM, Manu wrote:
>
>> ...
>>
>>
>> I don't think this is entirely true, auto ref is a template concept,
>>
>
> In the current implementation, but not necessarily.


It should stay that way. What it does in its current implementation is reasonable.

 that is, "automatic ref-ness", it selects the ref-ness of the argument
>> automatically, at compile time, just like auto applied everywhere else (selects a type for instance, at compile time).
>>
>
> auto was carried over from C and originally stands for local lifetime. It does _not_ mean "apply type deduction here".


Eh? 'local lifetime' doesn't sound to me like it has anything to do with 'apply type deduction here', which is what D does.

>> This concept doesn't
>> make any sense applied to a non-template. It *IS* a ref as specified by the programmer, there's nothing 'automatic' about it.
>>
>>
> Most keywords are poorly chosen.
>

Is this an argument to continue that trend?
That said, I don't find this to be particularly true. Most things make
reasonable sense.

>> So to say it will do 'exactly the same thing' is a misunderstanding. I
>> argue that 'auto ref' as applied to non-templates will only create confusion; it effectively reinforces the type of confusion that you have just shown.
>>
>> This is the reasoning for the argument behind scope ref, which to my
>> mind actually makes good sound sense, and should lead people to a proper
>> understanding of what you are actually doing.
>> Considering the key argument against 'scope ref' is that people don't
>> want to require more attributes to make use of it,
>>
>
> This is inaccurate.


It's the most consistent argument against scope ref.


May 10, 2013
On 10 May 2013 09:01, Rob T <alanb@ucora.com> wrote:

> On Thursday, 9 May 2013 at 21:55:51 UTC, Manu wrote:
>
>> Umm, what if the non-template counterpart returns ref? Then it doesn't behave the same.
>>
>
> Yes, the non-template version cannot return an auto ref param by ref, but I would expect that neither can the template version unless auto ref is also specified on the return value. The key point to understand is that auto ref on the return value of a template is not the same thing as auto ref on a parameter.
>
> For templates, you can enforce exactly the same behavior as the non-template version by specifying the return as either ref or by value, i.e. you do not specify auto ref on the return.
>
> If you happen to specify a ref return of an auto ref param, I would expect the compiler to refuse to compile it, regardless of whether it is a template function or not. Of course you could confuse the programmer and allow such a thing to compile only for the cases where it can return ref on the auto ref, but IMO that would be a big mistake, as it would confuse the s*** out of most people, and besides, allowing something like that has no value at all.
>
>>> D is a complex language, so stuff like this does take some getting used
>>> to, but it is very powerful and flexible, no two ways around it.
>>>
>>
>> If this takes 'getting used to', you're basically admitting that it
>> doesn't
>> make intuitive sense.
>>
>
> Well, I suppose I cannot disagree with you on that point, so yes it is confusing, but for the sake of a solution, it is not nearly as confusing so long as auto ref on the parameters behaves the same way for both the template and non-template versions. I know auto ref on the return is potentially confusing, but it does increase template flexibility, which is the benefit.
>

It IS confusing that auto ref would do two completely different things: one automatically selecting ref-ness, the other saying "I can safely receive a temporary". There is nothing 'automatic' about the latter.
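
(For illustration, here's a minimal sketch of the two behaviors; the non-template form is hypothetical, since it doesn't exist today:)

  import std.stdio;

  // Today's 'auto ref' is a template feature: ref-ness is deduced per call.
  void f()(auto ref int x)
  {
      writeln("passed by ref? ", __traits(isRef, x));
  }

  void main()
  {
      int a = 1;
      f(a); // instantiated with a ref parameter   -> "passed by ref? true"
      f(2); // instantiated with a value parameter -> "passed by ref? false"
  }

  // The proposed non-template 'auto ref' would instead be a single function
  // whose ref parameter may also bind a temporary; nothing is deduced there,
  // which is the objection to reusing the name:
  //   void g(auto ref int x) {}  // hypothetical, not valid D today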

> Alternatively, I think Jonathan's argument against scope ref makes perfect
> sense. Unless I misread something or he's dead wrong, how can scope be used without creating even more confusion? Even if scope has some properties in common with auto ref, it specifies something entirely different from accepting rvalues and lvalues. This point reminds me of why we should not be using bool as an integral type; it's the same reasoning.


Which argument?

Correct, it specifies something _entirely different_, it says "I can safely receive a temporary, because I promise not to escape it". This is the actual problem that we're trying to solve, and it addresses the problem head on.

As I've had to reiterate countless times, and such is the massive fallacy behind all of these threads, this whole debate is NOT about lvalues/rvalues. I wish people would stop using the term 'rvalue' in their posts; I worry that they misunderstand the problem every time it's said.

This code is broken:
  void f(ref int x) {}
  int x;
  f(x);

x is an lvalue.
This is the real problem case, and addressing this will solve the rvalue
case at the same time.
Passing an rvalue to a function just generates an implicit temp which is
functionally identical to the above, except the lifetime of a temp is
usually the life of the statement rather than the outer scope.
The problem we need to solve is that of a function being able to safely
receive a _temporary_.
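
(A minimal sketch of that equivalence; makeInt is just a hypothetical helper that returns an rvalue:)

  void f(ref int x) { x += 1; }

  int makeInt() { return 42; }

  void main()
  {
      // Explicit temporary: an ordinary local lvalue. Fine today.
      int tmp = makeInt();
      f(tmp);

      // Passing the rvalue directly is rejected today, but it would just be
      // the compiler introducing a hidden temp scoped to the statement:
      //   f(makeInt());  // conceptually: { int __tmp = makeInt(); f(__tmp); }
  }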


May 10, 2013
On 10 May 2013 09:09, Timon Gehr <timon.gehr@gmx.ch> wrote:

> On 05/10/2013 12:42 AM, Manu wrote:
>
>> On 10 May 2013 08:09, Jonathan M Davis <jmdavisProg@gmx.com
>> <mailto:jmdavisProg@gmx.com>> wrote:
>> ...
>>
>>     So, using scope ref to
>>     solve that problem makes no sense, and the changes that we've
>>     proposed to make
>>     to ref to make it @safe pretty much make scope unnecessary.
>>
>>
>> I agree, that's why I'm also happy with 'ref' alone, but I still feel it doesn't communicate as much information, which is trivial by contrast.
>>
>>     scope ref would be
>>     virtually identical to ref given that ref already has to guarantee
>>     that the
>>     variable being referenced doesn't get destroyed before the ref is.
>>
>>
>> No, there are other bonuses:
>>   - It mechanically enforces that a given argument will not have a pointer
>> taken and escaped.
>>
>
> This is the same for 'ref' in @safe code in the final implementation.


Fine. @safe is expected to place more restrictions. But it's still something that people need to do sometimes in un-@safe code.

>>   - It gives the extra information to the programmer who can better
>> reason about API intent.
>>
>
> No. scope is supposed to restrict escaping (in some way that is still to be determined). If it is overloaded to also mean 'accept rvalues', then reasoning about API intent is actually harmed, because it will not be clear whether 'scope' was added to restrict escaping alone or also to accept rvalues.


The fact that it can safely receive a temporary is implicit if there is a guarantee that it will not escape.

Why should explicit syntax exist to say "I accept rvalues", and not "I can safely receive temporaries"? The rvalue case is a subset, and I see no reason for it to receive special treatment.
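
(The hazard in question, as a small and deliberately unsafe sketch:)

  int* escaped; // module-level pointer, purely for demonstration

  void bad(ref int x)
  {
      escaped = &x; // escapes the reference; exactly what the no-escape
                    // guarantee (or @safe ref checks) would forbid
  }

  void main()
  {
      {
          int local = 1;
          bad(local);
      }
      // 'escaped' may now dangle; with a temporary argument the lifetime
      // would be even shorter (the end of the statement).
  }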

>>   - It allows 'ref' alone to retain an important function where it may
>> escape a pointer if it wants to.
>>
>>
> In @safe code? No way.
>

Sure, people expect restrictions in @safe code. But it's still something that people need to do sometimes in un-@safe code.

>>     The only
>>     real difference would be that scope would presumably additionally
>>     prevent doing
>>     @system stuff like taking the address of the ref. I don't see how
>>     this buys us
>>     anything.
>>
>>
>> Yes this is an advantage, I listed it above. It buys the programmer some additional flexibility/choice.
>>
>>     I agree that auto ref isn't a great name, but it's what we already
>>     have, and
>>     using it on non-templated functions would be using it for exactly
>>     what it was
>>     designed for in the first place and how it's described in TDPL.
>>
>>
>> I'm not going to change my position that it makes no sense, and is
>> misleading/confusing without some real arguments, which nobody seems
>> able to provide.
>> auto ref has already shown to create misunderstanding in the minds of
>> non-super-technical programmers.
>>
>
> I think that this should be an oxymoron.


What?

>> Syntax should encourage correct understanding.
>>
>>
> It can't. FWIW overloading scope fails this requirement badly.
>

It's not an overload; it's a natural extension of the concept. Using auto is an overload!

>>     As has already been discussed in this thread, it will introduce
>>     maintenance
>>     problems if ref accepts rvalues.
>>
>>
>> I'm not bothered by that personally, but if it's critically important, then we start arguing scope ref again. Otherwise I am happy to accept 'ref' with new added safety.
>>
>
> Either scope ref will turn out to be liable to similar issues, or a keyword will have been wasted.
>

I don't understand. Can you expand on this comment?


May 10, 2013
On 10 May 2013 09:20, Rob T <alanb@ucora.com> wrote:

> On Thursday, 9 May 2013 at 22:42:14 UTC, Manu wrote:
>
>>> And it's even questionable that scope as originally intended can be
>>> properly implemented anyway.
>>
>> ...so, the problem is no different than 'auto ref' as you mention above.
>> It's not implemented as drafted, and we're debating what's actually
>> correct. Clearly the draft was incomplete in both cases.
>> I only support the proposal (from others) that scope ref makes so much
>> more
>> sense, and I think we've also proven it can be made to work syntactically
>> without holes, which I don't believe is so for auto ref.
>>
>>
> However, despite the elusiveness of a solution, it looks like we'll be able to implement auto ref as it was originally intended. We may also be able to implement scope as it was originally intended, but not if we use it for another purpose.
>

Except that auto ref as originally intended seems to have been a flawed design, as evidenced by the massive waves this issue keeps creating.

The scope ref proposal does not interfere with scope as originally intended; it is a natural extension of the concept... unless I don't properly understand scope as originally intended (which is possible; it's barely documented).

> In any event, you may want to use scope ref to prevent escapes and also
> refuse to use rvalues, so it is not a good solution for that reason alone.


Why? Why would a function want to receive an explicit temporary but not an implicit one?


May 10, 2013
On Friday, May 10, 2013 10:08:37 Manu wrote:
> Correct, it specifies something _entirely different_, it says "I can safely receive a temporary, because I promise not to escape it". This is the actual problem that we're trying to solve, and it addresses the problem head on.
> 
> As I've had to reiterate countless times, and such is the massive fallacy behind all of these threads, this whole debate is NOT about lvalues/rvalues. I wish people would stop using the term 'rvalue' in their posts; I worry that they misunderstand the problem every time it's said.
> 
> This code is broken:
> void f(ref int x) {}
> int x;
> f(x);
> 
> x is an lvalue.
> This is the real problem case, and addressing this will solve the rvalue
> case at the same time.
> Passing an rvalue to a function just generates an implicit temp which is
> functionally identical to the above, except the lifetime of a temp is
> usually the life of the statement rather than the outer scope.
> The problem we need to solve is that of a function being able to safely
> receive a _temporary_.

The runtime check for ref that we agreed on already solves the @safety problem. So, I see no point in discussing the @safety problem further unless there's something wrong with the runtime check solution. And yes, the @safety problem is not just a question of rvalues. But whether ref should accept rvalues is very important with regards to being able to write and understand correct and maintainable code.

The question of accepting rvalues that we are therefore discussing has _nothing_ to do with @safety. It's entirely a question of avoiding other types of bugs - like accepting nonsense like swap(5, 7), which in that case is fortunately obvious but is not obvious in the general case. IMHO, it needs to be clear when a function intends to take an argument by ref because it intends to mutate the argument and when it intends to take an argument by ref because it wants the efficiency boost of avoiding the copy. In the first case, it makes no sense to accept rvalues, and in the second case, you definitely want to accept rvalues. As such, having different syntax is needed (be it auto ref or @acceptrvalue or whatever).
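
(A minimal sketch of the two intents; the names are illustrative, not from the thread:)

  // Intent 1: ref because the function mutates its arguments.
  // Accepting rvalues here would be nonsense; the mutation is lost.
  void swap(ref int a, ref int b)
  {
      immutable tmp = a;
      a = b;
      b = tmp;
  }

  // Intent 2: ref purely to avoid copying a large struct.
  // Accepting rvalues here is exactly what you want.
  struct Big { int[1024] data; }

  int sum(const ref Big b)
  {
      int total = 0;
      foreach (x; b.data)
          total += x;
      return total;
  }

  void main()
  {
      int x = 1, y = 2;
      swap(x, y);    // fine: lvalues
      // swap(5, 7); // should remain an error: mutating temporaries is meaningless

      Big big;
      sum(big);      // fine
      // sum(Big()); // the rvalue case under debate: rejected today
  }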

I'm not entirely against adding a new attribute for that (it would have the added benefit of not needing a compiler optimization to guarantee that a templated function takes its argument by ref when passed an rvalue), but Walter and Andrei don't want to add new attributes if they can avoid it, so I don't expect them to be okay with adding a new attribute. And since auto ref was originally supposed to be this attribute, I'd _much_ rather have that do it than make the mistake of letting ref accept rvalues.

- Jonathan M Davis
May 10, 2013
On 10 May 2013 10:31, Jonathan M Davis <jmdavisProg@gmx.com> wrote:

> On Friday, May 10, 2013 10:08:37 Manu wrote:
> > [...]
>
> The question of accepting rvalues that we are therefore discussing has
> _nothing_ to do with @safety. It's entirely a question of avoiding other
> types of bugs - like accepting nonsense like swap(5, 7), which in that
> case is fortunately obvious but is not obvious in the general case. IMHO,
> it needs to be clear when a function intends to take an argument by ref
> because it intends to mutate the argument and when it intends to take an
> argument by ref because it wants the efficiency boost of avoiding the
> copy. In the first case, it makes no sense to accept rvalues, and in the
> second case, you definitely want to accept rvalues. As such, having
> different syntax is needed (be it auto ref or @acceptrvalue or whatever).
>
> [...] And since auto ref was originally supposed to be this attribute,
> I'd _much_ rather have that do it than make the mistake of letting ref
> accept rvalues.
>

What were the arguments again against ref const()? You're talking about
making it clear that the function isn't planning on mutating the given
rvalue...
I understand that D's const is stronger than C++'s, but is it actually a
deal-breaker? It's the most logical fit here.


May 10, 2013
On Friday, May 10, 2013 10:38:47 Manu wrote:
> What were the arguments again against ref const()? You're talking about
> making it clear that the function isn't planning on mutating the given
> rvalue...
> I understand that D's const is stronger than C++'s, but is it actually a
> deal-breaker? It's the most logical fit here.

Not being able to differentiate between when it's an rvalue and lvalue causes some problems, though Andrei understands those issues far better than I do. I've been able to come up with cases where it would be a problem if you could have ref on a local variable, but without that, I don't really understand where you end up with problems with const ref due to not being able to differentiate between rvalues and lvalues. It's easy to come up with cases with plain ref (e.g. the example in Ali's talk when a type's opAssign treated rvalues and lvalues differently, swapping guts with the rvalue and copying from the lvalue), but I'm not as clear on when it ends up being an issue with const. The fact that we don't allow ref on anything other than parameters and return types really simplifies things, so I'm not sure that the situation with const ref would be anywhere near as bad in D as Andrei thinks that it is in C++, but I don't know.
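
(For reference, a rough sketch of that kind of opAssign, reconstructed from the description here rather than from Ali's actual slides:)

  import std.algorithm : swap;

  struct Container
  {
      int[] payload;

      // auto ref makes rhs a reference for lvalue arguments and a value
      // (the temporary itself) for rvalue arguments.
      void opAssign()(auto ref Container rhs)
      {
          static if (__traits(isRef, rhs))
          {
              // lvalue: the caller still owns rhs, so take a real copy
              payload = rhs.payload.dup;
          }
          else
          {
              // rvalue: rhs is a temporary we own, so just steal its guts
              swap(payload, rhs.payload);
          }
      }
  }

  void main()
  {
      Container a, b;
      b.payload = [1, 2, 3];
      a = b;                 // lvalue path: copies
      a = Container([4, 5]); // rvalue path: swaps guts
  }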

However, regardless of all that, the primary problem that I see with using const ref to indicate that you want to accept rvalues by ref is the fact that D's const is so restrictive, and there are types where const just plain doesn't work for them. As such, it strikes me as a bad idea to require const in order to accept rvalues by ref even if there are no other problems with it.

Now, you could end up with other weird problems if you used auto ref (assuming that we used auto ref for this) without const and the function _did_ mutate its argument (we already have that problem with auto ref and templates), but if you really want the extra protection and can afford it, you can always use const with it. The problem is the cases where you can't use const. We've been trying very hard to make it so that const is completely optional precisely because of how restrictive it is. Requiring it for this seems problematic to me.

- Jonathan M Davis
May 10, 2013
On Thu, 09 May 2013 17:45:33 -0400, Peter Alexander <peter.alexander.au@gmail.com> wrote:

> On Thursday, 9 May 2013 at 20:59:14 UTC, Steven Schveighoffer wrote:
>> I should restate: The idea that restricting rvalues from binding to refs *except* in the case where they bind to 'this' is a hack. The binding to 'this' pre-dates the restriction.
>
> Ah ok. I do agree that the asymmetry is quite hacky, although binding to 'this' isn't quite the same as binding to ref.

It's exactly the same actually.  'this' is an implicit ref argument.  I don't see how it's any different.

We also currently have a hack where ++ is artificially prevented from working on rvalues, but an equivalent .increment() member function works fine.
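
(A small sketch of the asymmetry, with illustrative names:)

  struct Counter
  {
      int n;
      void increment() { ++n; }
  }

  Counter make() { return Counter(0); }

  void main()
  {
      make().increment(); // OK: the rvalue temporary binds to the implicit ref 'this'
      // ++make().n;      // error: ++ is (artificially) rejected on the rvalue
  }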

>>> I'm not sure about how common it is, but I really don't like the idea of calls like swap(1, 2) being legal. Seems like a step backward from C++.
>>
>> Why is it so bad that that is legal?  Really hard to stop people from writing incorrect code.
>
> You could apply the same argument against static typing, e.g. "swap(apple, orange) looks wrong, so why is it so bad not to catch the type error at compile time?"
>
> I think we can agree that while it's not possible to stop people from writing incorrect code 100% of the time, there is still benefit to catching as many cases as practical.

I define practical as catching as many things that shouldn't work as you can without making it impossible to write things that should.

-Steve
May 10, 2013
On 5/9/13 4:36 PM, Peter Alexander wrote:
> I'm not sure about how common it is, but I really don't like the idea of
> calls like swap(1, 2) being legal. Seems like a step backward from C++.

I think if we ever get swap(1, 2) to compile and run we'd effectively have destroyed the D programming language.

Andrei
May 10, 2013
On 5/9/13 5:55 PM, Manu wrote:
> auto is a template concept, it should not be applied here.

auto x = 5;

Where's the template?


Andrei