December 05, 2014
On Friday, 5 December 2014 at 21:32:53 UTC, Walter Bright wrote:
>>> I don't believe this is correct. Rvalues can be assigned, just like:
>>>
>>>   __gshared int x;
>>>   { int i; x = i; }
>>>
>>> i's scope ends at the } but it can still be assigned to x.
>>>
>>
>> It works even better when i has indirections.
>
> I understand what you're driving at, but only a scoped rvalue would not be copyable.
>

The DIP says nothing about scoped rvalues having different behavior
than non-scoped ones.

>>> I originally had scope only apply to ref, but that made
>>> having scoped classes
>>> impossible.
>>>
>>
>> Promoting a scoped class to the stack is an ownership problem, and out
>> of scope (!). It makes sense to allow it as an optimization.
>>
>> Problem is, lifetime becomes infinite after an indirection, so I'm
>> not sure what the guarantee is.
>
> The guarantee is there will be no references to the class instance after the scoped class goes out of scope.
>

Through use of that view, yes. I see it as follows:
  - When an unconsumed owner goes out of scope, it can (must?) be
freed automatically.
  - scope uses do not consume.
  - When the compiler sees an alloc/free pair, it can promote the
allocation to the stack.

That is a much more useful definition, as it allows for stack
promotion after inlining:
class FooBuilder {
     Foo build() { return new Foo(); }
}

class Foo {}

void bar() {
     auto f = new FooBuilder().build();
     // Use f and do not consume...
}

This can be reduced in such a way that no allocation happens at all.
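As a sketch of what that reduction would be equivalent to (hypothetical compiler output, written here by hand using std.typecons.scoped), the unconsumed Foo never escapes bar(), so it can live on the stack:

```d
import std.typecons : scoped;

class Foo {}

void bar() {
    // Equivalent of the promoted form: the instance is constructed in
    // a stack buffer and destroyed at the closing brace, so no GC
    // allocation happens.
    auto f = scoped!Foo();
    // Use f and do not consume...
}
```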

>> I'd cause everything reached through the view to be scope, which
>> obviates the need for things like &(*e) having special meaning.
>
> Are you suggesting transitive scope?

For rvalues, yes. Not for lvalues.
December 06, 2014
Walter Bright:

> The question, obviously, is what is good enough to get the job done. By "the job", I mean reference counting, migrating many allocations to RAII (meaning the stack), and eliminating a lot of closure GC allocations.

Another essential purpose of a proposal like this is to avoid bugs caused by using memory from vanished stack frames, etc.; that is, to have memory safety in D.

Bye,
bearophile
December 06, 2014
On Friday, 5 December 2014 at 20:55:55 UTC, Walter Bright wrote:
> On 12/5/2014 7:27 AM, Steven Schveighoffer wrote:
>> Can someone who knows what this new feature is supposed to do give some Ali
>> Çehreli-like description on the feature? Basically, let's strip out the *proof*
>> in the DIP (the how it works and why we have it), and focus on how it is to be
>> used.
>>
>> I still am having a hard time wrapping my head around the benefits and when to
>> use scope, scope ref, why I would use it. I'm worried that we are adding all
>> this complication and it will confuse the shit out of users, to the point where
>> they won't use it.
>
> The tl;dr version is when a declaration is tagged with 'scope', the contents of that variable will not escape the lifetime of that declaration.
>
> It means that this code will be safe:
>
>    void foo(scope int* p);
>
>    p = malloc(n);
>    foo(p);
>    free(p);
>
> The rest is all the nuts and bolts of making that work.

<brainstorm>
What about also adding the inverse of scope? Then scope can be inferred. As in:

```
void foo(int* p);
void free(P)(consume P* p);

p = malloc(n);
foo(p);  // here foo() gets scope, since free consumes it.
free(p);
```

So you do not need to write scope everywhere. And it would prevent this code:

```
{
  free(p);
  free(p);  // Error: p already consumed
}
```
</brainstorm>
December 06, 2014
On 5 December 2014 at 07:14, Walter Bright via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
> On 12/4/2014 10:45 AM, H. S. Teoh via Digitalmars-d wrote:
>>
>> However, AFAIK, template *classes* trigger attribute inference on its (non-template) member functions, so this would be problematic:
>>
>>         class Base(T) {
>>                 T data;
>>                 void method(ref T); // inferred to be scope
>>         }
>>
>>         class Derived : Base!int {
>>                 override void method(ref T); // oops, cannot override
>>         }
>
>
> I agree, it's a good point. Scope inference cannot be done for virtual functions. I amended the DIP.

No comment...
December 06, 2014
On 6 December 2014 at 09:58, Walter Bright via Digitalmars-d <digitalmars-d@puremagic.com> wrote:
> On 12/5/2014 8:48 AM, "Marc Schütz" <schuetzm@gmx.net>" wrote:
>>
>> There are limitations this proposal has in comparison to my original one.
>> These
>> limitations might of course be harmless and play no role in practice, but
>> on the
>> other hand, they may, so I think it's good to list them here.
>
>
> Good idea. Certainly, this is less powerful than your proposal. The question, obviously, is what is good enough to get the job done. By "the job", I mean reference counting, migrating many allocations to RAII (meaning the stack), and eliminating a lot of closure GC allocations.
>
>
>> Additionally I have to agree with Steven Schveighoffer: This DIP is very
>> complicated to understand. It's not obvious how the various parts play
>> together,
>> and why/to which degree it "works", and which are the limitations. I don't
>> think
>> that's only because my brain is already locked on my proposal...
>
>
> I'm still looking for an easier way to explain it. The good news in this, however, is that if it is correctly implemented, the compiler should be a big help in using scope correctly.
>
>
>> 1) Escape detection is limited to `ref`.
>>
>>      T* evil;
>>      ref T func(scope ref T t, ref T u) @safe {
>>        return t; // Error: escaping scope ref t
>>        return u; // ok
>>        evil = &u; // Error: escaping reference
>>      }
>>
>> vs.
>>
>>      T[] evil;
>>      T[] func(scope T[] t, T[] u) @safe {
>>        return t; // Error: cannot return scope
>>        return u; // ok
>>        evil = u; // !!! not good
>
>
> right, although:
>      evil = t;  // Error: not allowed
>
>>      }
>>
>> As can be seen, `ref T u` is protected from escaping (apart from returning
>> it),
>> while `T[] u` in the second example is not. There's no general way to
>> express
>> that `u` can only be returned from the function, but will not be retained
>> otherwise by storing it in a global variable. Adding `pure` can express
>> this in
>> many cases, but is, of course, not always possible.
>
>
> As you point out, 'ref' is designed for this.
>
>
>> Another workaround is passing the parameters as `ref`, but this would
>> introduce
>> an additional indirection and has different semantics (e.g. when the
>> lengths of
>> the slices are modified).
>>
>> 2) `scope ref` return values cannot be stored.
>>
>>      scope ref int foo();
>>      void bar(scope ref int a);
>>
>>      foo().bar();        // allowed
>>      scope tmp = foo();  // not allowed
>>      tmp.bar();
>
>
> Right
>
>
>> Another example:
>>
>>      struct Container(T) {
>>          scope ref T opIndex(size_t index);
>>      }
>>
>>      void bar(scope ref int a);
>>
>>      Container c;
>>      bar(c[42]);            // ok
>>      scope ref tmp = c[42]; // nope
>>
>> Both cases should be fine theoretically; the "real" owner lives longer
>> than
>> `tmp`. Unfortunately the compiler doesn't know about this.
>
>
> Right, though the compiler can optimize to produce the equivalent.
>
>> Both restrictions 1) and 2) are because there are no explicit
>> lifetime/owner
>> designations (the scope!identifier thingy in my proposal).
>>
>> 3) `scope` cannot be used for value types.
>>
>> I can think of a few use cases for scoped value types (RC and file
>> descriptors),
>> but they might only be marginal.
>
>
> I suspect that one can encapsulate such values in a struct where access to them is strictly controlled.
>
>
>> 4) No overloading on `scope`.
>>
>> This is at least partially a consequence of `scope` inference. I think
>> overloading can be made to work in the presence of inference, but I
>> haven't
>> thought it through.
>
>
> Right. Different overloads can have different semantic implementations, so what should inference do? I also suspect it is bad style to overload on 'scope'.
>
>
>> 5) `scope` is a storage class.
>>
>> Manu complained about `ref` being a storage class. If I understand him
>> right,
>> one reason is that we have a large toolkit for dealing with type
>> modifiers, but
>> almost nothing for storage classes. I have to agree with him there. But I
>> haven't understood his point fully, maybe he himself can post more about
>> his
>> problems with this?
>
>
> I didn't fully understand Manu's issue, but it was about 'ref' not being inferred by template type deduction. I didn't understand why 'auto ref' did not work for him. I got the impression that he was trying to program in D the same way he'd do things in C++, and that's where the trouble came in.

NO!!
I barely program C++ at all! I basically write C code with 'enum' and
'class'. NEVER 'template', and very rarely 'virtual'.
Your impression is dead wrong.

If I do things in any particular 'way', it is that at all times, I
have regard for, and control over the code generation, the ABI, and I
also value distribution of code as static libs, which means I must
retain tight control over where code is, and isn't, emitted.
I expect this from a native language.


It's exactly as Marc says, we have the best tools for dealing with
types of any language I know, and practically none for dealing with
'storage class'.
In my use of meta in D, 'ref' is the single greatest cause of
complexity, code bloat, duplication, and text mixins. I've been
banging on about this for years!

I've been over it so many times.
I'll start over if there is actually some possibility I can convince
you? Is there?

I've lost faith that I am able to have any meaningful impact on issues
that matter to me, and I'm fairly sure at this point that my
compounded resentment and frustration actually discredit my cause, and
certainly, my quality of debate.
I'm passionate about these things, but I'm also extremely frustrated.
To date, whenever I have engaged in a topic that I *really* care
about, it goes the other direction. To participate has proven to be
something of a (very time consuming) act of masochism.


I __absolutely objected__ to 'auto ref' when it appeared. I argued that it was a massive mistake, and since its introduction and my experience with it in the wild, I am more confident in that conviction than ever.

As a programmer, I expect control over whether code is a template or
not. 'ref' doesn't have anything to do with templates. Confusing these
two concepts was such a big mistake.
auto ref makes a template out of something that shouldn't be a
template. It's a particularly crude hack to address a prior mistake.
ref should have been fixed at that time, not compounded with layers of
even more weird and special-case/non-uniform behaviours above it.
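As a minimal sketch of that complaint (my example, not Manu's code): 'auto ref' is only valid on template function parameters, so using it turns a plain function into a template, with separate instantiations for lvalue and rvalue arguments:

```d
// The empty ()() template parameter list is mandatory: a non-template
// function cannot take 'auto ref', so f is forced to become a template.
void f()(auto ref int x) {}

void main() {
    int i;
    f(i); // instantiated with x as 'ref int' (lvalue argument)
    f(1); // a second instantiation with x by value (rvalue argument)
}
```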


There have been a couple of instances where the situation was
appropriate and I tried to make use of auto ref, but in each case,
the semantics were never what I wanted.
auto ref presumes to decide for you when something should be a ref or
not. It prescribes a bunch of rules on how that decision is made, but
it doesn't know what I'm doing, and it gets it wrong.
In my experience, auto ref has proven to be, at best, completely
useless. But in some cases I've found where I bump into it in 3rd
party code, it's been a nuisance, requiring me to wrap it away.


ref is a bad design. C++'s design isn't fantastic, and I appreciate
that D made effort to improve on it, but we need to recognise when the
experiment was a failure. D's design is terrible; it's basically
orthogonal to the rest of the language. It's created way more
complicated edge cases for me than C++ references ever have. Anyone
who says otherwise obviously hasn't really used it much!
Don't double down on that mistake with scope.


My intended reply to this post was to ask you to justify making it a
storage class: why does the design fail as a type constructor?
Can we explore that direction to its point of failure?

As a storage class, it runs the risk of doubling the existing bloat
caused by ref. As a type constructor, I see no disadvantages.
It even addresses some of the awkward problems right in the face of
storage classes, like how/where the attributes actually apply. Type
constructors use parens, i.e. const(T), and scope(T) would make that
matter a lot more clear.
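To sketch the difference being argued for (the scope(T) type-constructor syntax is hypothetical and not valid D):

```d
// As a storage class, it is ambiguous what scope applies to:
//   scope int** p;  // the outer pointer? the inner one? both?
//
// As a hypothetical type constructor, parens pin it down, just as
// const(T) already does today:
//   scope(int*)* q; // mutable pointer to a scoped pointer
//   scope(int**) r; // the whole indirection chain is scoped
```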

Apart from the storage class issue, it looks okay, but it gives me the
feeling that it kinda stops short.
Marc's proposal addressed more issues. I feel this proposal will
result in more edge cases than Marc's proposal.
The major edge case that I imagine is that since scope return values
can't be assigned to scope locals, that will result in some awkward
meta, requiring yet more special cases. I think Marc's proposal may be
a lot more relaxed in that way.

December 06, 2014
On Friday, 5 December 2014 at 16:48:45 UTC, Marc Schütz wrote:
> There are limitations this proposal has in comparison to my original one. These limitations might of course be harmless and play no role in practice, but on the other hand, they may, so I think it's good to list them here.
>

One concern I had with your proposal was that it refers to a symbol before it's available.

scope!haystack(string) findSubstring(scope(string) haystack, scope(string) needle)

C++11 had similar issues and they solved it by introducing trailing return types.

ex:
"template<class T>
auto mul(T a, T b) -> decltype(a*b)"

Personally I would prefer not to go down that lane and DIP69 avoids that problem.
December 06, 2014
On 2014-12-06 10:50, Manu via Digitalmars-d wrote:

> I've been over it so many times.

I suggest you take the time and write down how your vision of "ref" looks like and the issue with the current implementation. A blog post, a DIP or similar. Then you can easily refer to that in cases like this. Then you don't have to repeat yourself so many times. That's what I did when there was a lot of talk about AST macros. I was tired of constantly repeating myself so I created a DIP. It has already saved me more time than it took to write the actual DIP.

-- 
/Jacob Carlborg
December 06, 2014
On Saturday, 6 December 2014 at 04:31:48 UTC, Sebastiaan Koppe wrote:
> What about also adding the inverse of scope? Then scope can be inferred. As in:
>
> ```
> void foo(int* p);
> void free(P)(consume P* p);


Yes, this is much better. When I suggested it, it was rejected because D is too concerned about breaking existing code. Which is a not-very-good argument, since this breaking change is conservative (you only have to add "consume" or something similar when the compiler complains).

The obvious solution is to do as you suggest and in addition do all @safe analysis on a high level IR layer using dataflow through and through.

Instead D continues down the rather flimsy path of partially addressing these issues in the type system… which will lead to a more complicated and less complete solution where @safe basically continues to be a leaky cauldron…
December 06, 2014
On Saturday, 6 December 2014 at 10:59:24 UTC, Daniel N wrote:
> On Friday, 5 December 2014 at 16:48:45 UTC, Marc Schütz wrote:
>> There are limitations this proposal has in comparison to my original one. These limitations might of course be harmless and play no role in practice, but on the other hand, they may, so I think it's good to list them here.
>>
>
> One concern I had with your proposal was that it refers to a symbol before it's available.
>
> scope!haystack(string) findSubstring(scope(string) haystack, scope(string) needle)
>
> C++11 had similar issues and they solved it by introducing trailing return types.
>
> ex:
> "template<class T>
> auto mul(T a, T b) -> decltype(a*b)"
>
> Personally I would prefer not to go down that lane and DIP69 avoids that problem.

For D, this wouldn't be necessary, because parsing and semantic analysis are strictly separated. The owners would only have to be evaluated very late during the semantic phase.

But let's see how DIP69 works out...
December 06, 2014
On 04/12/2014 21:23, H. S. Teoh via Digitalmars-d wrote:
> 	@property scope ref T borrow() { return t; }
> 	alias borrow this;

While this DIP enabling the above to be memory-safe is awesome, a later tweak to AliasThis grammar to allow storage classes could make 'borrow' redundant. Plain alias now allows storage classes:

AliasDeclaration:
    alias StorageClasses(opt) BasicType Declarator ;

So:

TweakedAliasThis:
    alias StorageClasses(opt) Identifier this ;

Given ref and now scope will be storage classes, that would then allow just:

alias scope ref t this;