March 08, 2009
Burton Radons wrote:
> Obviously I made it contrived so that it's as clear as possible what
> the issue is. In reality, it will be going through more layers.
> Here's one layer:
> 
> int [] a = new int [1]; a [0] = 1;
> 
> invariant (int) [] func (invariant (int) [] a) { return a; }
> 
> auto b = func (cast (invariant (int) []) a);
> 
> Notice this has the same pattern as std.string.replace; that's why I
> did that cast.
> 
> a [0] += b [0];
> a [0] += b [0];
> writef ("%s\n", a [0]);
> // Not optimised: 4.
> // Assuming b cannot be modified: 3.
> 
> When this actually crops up in bugs the reality will be far more
> complex and practically impossible to discover.


The solution is, when you cast things to invariant, that's a one-way ticket. You cannot continue to mutate the original. If you find you need to, the data must be duplicated, and the duplicate made invariant. If this is done a lot, it's time to refactor the program.

Casts are greppable, and should be reviewed anyway. In this case, casting an array to immutable should be followed by NO further references to the mutable version.
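
For illustration, a minimal sketch of that duplicate-then-freeze pattern (using .idup, where available, amounts to the same thing):

import std.stdio;

void main()
{
    int [] a = new int [1];
    a [0] = 1;

    // Duplicate first, then freeze only the copy. No mutable alias of
    // b's data survives, so 'a' may keep changing freely.
    invariant (int) [] b = cast (invariant (int) []) a.dup;

    a [0] += b [0];
    a [0] += b [0];
    writef ("a=%s b=%s\n", a [0], b [0]); // a=3 b=1, under any optimiser
}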
March 08, 2009
On Sun, 08 Mar 2009 01:19:15 -0800, Walter Bright wrote:

> Derek Parnell wrote:
>> Walter, you have side-stepped the problem in question by talking about a totally different problem.
> 
> It's the same issue. When you use a cast, you are subverting the type system. That means you have to be sure you are doing it right. The compiler cannot help you.

I disagree. There are two issues being discussed: one by Burton and another
by yourself. Burton is showing that data declared as immutable can be
modified. You are talking about the dangers of using casts. NOT THE SAME
ISSUE.

>> Burton's code says "b is invariant", but the program allows it to be changed. Your code does NOT say that any of those variables is invariant. The problem is NOT with the cast (although that is a totally different issue). The problem is that the code says "invariant" but the data gets changed anyhow. The method of changing the data is not the issue. The issue is that it gets changed at all.
> 
> When you cast something to immutable, you can no longer change it. It's a one-way ticket.

Walter, did you actually see and run that code? I did cast something to immutable, but it got changed anyway. Is this a compiler bug or what? You say that once one casts something to immutable one can no longer change it - but it DID get changed.

I know that a better way to code this example would have been to use the .idup functionality, but that is not the point. I relied on the compiler ensuring that everything declared as immutable would not be modified. The compiler failed.

-- 
Derek Parnell
Melbourne, Australia
skype: derek.j.parnell
March 08, 2009
Burton Radons wrote:
>    int [] a = new int [1];
> 
>    a [0] = 1;
> 
>    auto b = cast (invariant (int) []) a;
> 
>    a [0] += b [0];
>    a [0] += b [0];
>    writef ("%s\n", a [0]);
>    // Normal result: 4.
>    // Optimiser which assumes invariant data can't change: 3
> 
> Yes, the code is an abuse of the const system. THAT'S EXACTLY MY POINT. Casting mutable data to invariant leads to situations like these. Only data which will never change can be made invariant. Putting "alias invariant (char) [] string" in object.d induces these situations and makes it seem like it's a good idea.

You're going into undefined territory and complaining that it doesn't work as you expect. Perhaps that should issue a warning, but you're doing something wrong and bad: you're saying that the same array is both mutable and immutable.

Think of the other approach: once you cast an array to invariant, the compiler finds all aliases of that array and turns them invariant. You'd be even more upset in that case. It would have far-reaching effects that are hard to track down.

Or you could forbid the cast. But it's a useful cast, and you really can't get rid of it if you ever want to convert something that is mutable to something that is invariant without copying.
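
For instance, the common legitimate use is to build data mutably inside a function and freeze it on the way out, letting the only mutable reference die with the scope (a sketch; makeSquares is a made-up name):

invariant (int) [] makeSquares (int n)
{
    int [] buf = new int [n];
    foreach (i, ref x; buf)
        x = cast (int) (i * i);
    // buf never escapes as mutable; freezing it here, without a copy,
    // is the whole point of allowing the cast.
    return cast (invariant (int) []) buf;
}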
March 08, 2009
Derek Parnell wrote:
> On Sat, 07 Mar 2009 23:40:52 -0800, Walter Bright wrote:
> 
>> I'm still not understanding you, because this is a contrived example that I cannot see the point of nor can I see where it would be legitimately used.
> 
> I can see Burton's concern, and I'm very surprised that the compiler allows
> this to happen. Here is a slightly more explicit version of Burton's code.
> 
> import std.stdio;
> void main()
> {
>   int [] a = new int [1];
> 
>    a [0] = 1;
> 
>    invariant (int) [] b = cast (invariant (int) []) a;
> 
>    writef ("a=%s b=%s\n", a [0], b[0]);
>    a [0] += b [0];
>    writef ("a=%s b=%s\n", a [0], b[0]);
>    a [0] += b [0];
>    writef ("a=%s b=%s\n", a [0], b[0]);
> }
> 
> The problem is that we have declared 'b' as invariant, but the program is
> allowed to change it. That is the issue.

As I see it, the cast should fail saying "can't cast a to invariant because a is mutable", because otherwise you are producing a behaviour that doesn't match what the code says.

In:

char c;
int* p = cast(int*)&c;
*p = 5;

there's nothing wrong with the cast, because a char can be viewed as an int. But mutable data cannot be viewed as immutable.

Walter, you say "Whenever you use a cast, the onus is on the programmer to know what they are doing." That's true. But if the programmer made a mistake and didn't know what she was doing, the compiler or the runtime should yell.

For example:

class A { }

class B : A { }

class C : A { }

void foo(A a) {
  C c = cast(C) a;
}

The programmer is sure "a" is really of type "C". If it is, ok, everything works. Now suppose it is not: the cast yields a null "c", which fails at runtime as soon as "c" is used (and it will be used eventually, because that's why we cast it). The failure is a big one: the program halts. (Well, I'd prefer an exception to be thrown, but that's another issue.)

Now, if you allow a cast from mutable to immutable to proceed, the error at runtime will be very subtle and really hard to find, because the program will continue to work, but with wrong behaviour.
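
To make the contrast concrete (a sketch reusing the A/B/C classes above):

A a = new B;
C c = cast (C) a;  // bad downcast: c is null
// c.toString();   // loud failure: the null dereference halts the program

int [] m = new int [1];
invariant (int) [] i = cast (invariant (int) []) m;
m [0] = 42;        // silent failure: the "immutable" i just changed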
March 08, 2009
Christopher Wright Wrote:

> Burton Radons wrote:
> >    int [] a = new int [1];
> > 
> >    a [0] = 1;
> > 
> >    auto b = cast (invariant (int) []) a;
> > 
> >    a [0] += b [0];
> >    a [0] += b [0];
> >    writef ("%s\n", a [0]);
> >    // Normal result: 4.
> >    // Optimiser which assumes invariant data can't change: 3
> > 
> > Yes, the code is an abuse of the const system. THAT'S EXACTLY MY POINT. Casting mutable data to invariant leads to situations like these. Only data which will never change can be made invariant. Putting "alias invariant (char) [] string" in object.d induces these situations and makes it seem like it's a good idea.
> 
> You're going into undefined territory and complaining that it doesn't work as you expect. Perhaps that should issue a warning, but you're doing something wrong and bad: you're saying that the same array is both mutable and immutable.
> 
> Think of the other approach: once you cast an array to invariant, the compiler finds all aliases of that array and turns them invariant. You'd be even more upset in that case. It would have far-reaching effects that are hard to track down.
> 
> Or you could forbid the cast. But it's a useful cast, and you really can't get rid of it if you ever want to convert something that is mutable to something that is invariant without copying.

A cast could be avoided if the compiler could track unique mutable references. Assignment to invariant could then be done implicitly, with the mutable reference ceasing to exist afterwards. This would require escape analysis for mutable references. I like allowing implicit invariant casts, but I seem to be in the minority; it does bring further language complexity.
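
Lacking that analysis, a library-level helper can at least make the convention explicit (a sketch; freezeUnique is a made-up name, though Phobos's assumeUnique is in the same spirit):

invariant (T) [] freezeUnique (T) (ref T [] array)
{
    auto result = cast (invariant (T) []) array;
    array = null; // destroy the caller's mutable reference
    return result;
}

// Usage: after the call, buf is null, so no writable alias survives
// (by convention only; the compiler cannot verify uniqueness):
//    int [] buf = new int [1];
//    buf [0] = 1;
//    auto frozen = freezeUnique (buf);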
March 08, 2009
Derek Parnell wrote:
> On Sun, 08 Mar 2009 01:19:15 -0800, Walter Bright wrote:
> 
>> Derek Parnell wrote:
>>> Walter, you have side-stepped the problem in question by talking about a
>>> totally different problem.
>> It's the same issue. When you use a cast, you are subverting the type system. That means you have to be sure you are doing it right. The compiler cannot help you.
> 
> I disagree. There are two issues being discussed. One by Burton and another
> by yourself. Burton is showing how that data declared as immutable can be
> modified. You are talking about the dangers of using casts. NOT THE SAME
> ISSUE. 

It is THE VERY SAME ISSUE.

Andrei
March 08, 2009
Walter Bright wrote:
> Derek Parnell wrote:
>> import std.stdio;
>> void main()
>> {
>>   int [] a = new int [1];
>>
>>    a [0] = 1;
>>
>>    invariant (int) [] b = cast (invariant (int) []) a;
>>
>>    writef ("a=%s b=%s\n", a [0], b[0]);
>>    a [0] += b [0];
>>    writef ("a=%s b=%s\n", a [0], b[0]);
>>    a [0] += b [0];
>>    writef ("a=%s b=%s\n", a [0], b[0]);
>> }
>>    The problem is that we have declared 'b' as invariant, but the program is
>> allowed to change it. That is the issue.
>>
> 
> The following also compiles:
> 
> char c;
> int* p = cast(int*)&c;
> *p = 5;
> 
> and is clearly buggy code. Whenever you use a cast, the onus is on the programmer to know what they are doing. The cast is an escape from the typing system.

Sometimes a cast can break a program, sometimes a cast is absolutely harmless:

int w, h;
double ratio = (cast(double)w) / h;

Are you sure you don't want to introduce something like dangerous_cast() for when a cast is dangerous because it e.g. reinterprets raw memory bytes, or aliases immutable memory as mutable memory?
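
A sketch of what such a wrapper might look like (dangerousCast is hypothetical, not an existing feature; it has the same power as a plain cast):

To dangerousCast (To, From) (From value)
{
    // Identical semantics to cast(To), but the dangerous uses now
    // stand out in a grep, separate from harmless numeric conversions.
    return cast (To) value;
}

// e.g.  auto b = dangerousCast!(invariant (int) [])(a);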
March 09, 2009
On Sun, 08 Mar 2009 13:01:27 -0200, Ary Borenszweig wrote:

> As I see it, the cast should fail saying "can't cast a to invariant because a is mutable", because otherwise you are producing a behaviour that doesn't match what the code says.
> 
> In:
> 
> char c;
> int* p = cast(int*)&c;
> *p = 5;
> 
> there's nothing wrong about the cast because a char can be viewed as an int. But a mutable data cannot be seen as immutable.

It's not interpreting a char as an int. It's interpreting c plus part of the return address as an int, on x86 essentially corrupting the stack. Or it gets an unaligned-access hardware exception. Or whatever. It's anything but correct code. It's exactly the same issue as casting mutable to immutable: you're abusing the power the compiler gave you.

Though I agree that the line between safe and unsafe casts is rather subtle.
March 09, 2009
Derek Parnell wrote:
> I know that a better way to code this example would have been to use the
> .idup functionality, but that is not the point. I relied on the compiler
> ensuring that everything declared as immutable would not be modified. The
> compiler failed.

It is the same issue. When you use a cast, you are *explicitly* defeating the language's type checking ability. It means that the onus is on the one doing the cast to get it right.
March 09, 2009
On Sun, 08 Mar 2009 17:24:09 -0700, Walter Bright wrote:

> Derek Parnell wrote:
>> I know that a better way to code this example would have been to use the .idup functionality, but that is not the point. I relied on the compiler ensuring that everything declared as immutable would not be modified. The compiler failed.
> 
> It is the same issue. When you use a cast, you are *explicitly* defeating the language's type checking ability. It means that the onus is on the one doing the cast to get it right.

Except when you want invariant data; then a cast is *required*. At that point, it is a language feature, not a defeat of the type system. I think there is some merit to the arguments presented in this thread, but I don't think the answer is to get rid of invariant. Perhaps make the compiler more strict when creating invariant data? I liked the ideas people have presented about unique mutable references (before and in this thread). This might even be solvable in a library, given all the advances in structs (a rough sketch follows below).

So it's not exactly the same issue, because in one case you are doing something totally useless and stupid, while in the other it is a language *requirement* to use casting to get invariant data. However, in both cases the onus is on the developer, which sucks in the latter case...
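
A rough sketch of that struct idea (Unique is hypothetical, and uniqueness is still only by convention):

struct Unique (T)
{
    private T [] payload;

    this (T [] data) { payload = data; }

    // Consuming conversion: after release() the wrapper holds nothing,
    // so no mutable view remains, provided the caller didn't keep the
    // slice it passed in.
    invariant (T) [] release ()
    {
        auto r = cast (invariant (T) []) payload;
        payload = null;
        return r;
    }
}

// Usage sketch:
//    auto u = Unique!(int)(new int [1]);
//    invariant (int) [] frozen = u.release ();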

Walter: Use invariant when you can, it's the best!
User: ok, how do I use it?
Walter: You need to cast mutable data to invariant, but it's on you to
make sure nobody changes the original mutable data.  Casting circumvents
the type system, so the compiler can't help you.
User: :(

-Steve