Posted by Daniel Lewis
Robert Fraser Wrote:
> > Dan wrote:
> >> So we can:
> >> - fearlessly run parallel functions over the same data
> >> - cache something and expect it to be correct indefinitely
> >> - trust our libraries, and 3rd parties not to mangle our stuff
> >> - catch accidental writes to something we wanted read-only
> >> - make guarantees to clients and employers (our code *cannot* change
> >> your x)
> >> - be aware the opposite is probably true for non-invariants
> >> In all honesty though, we must know it's a bluff. The code *can* change out from under you. It just won't by your own hand.
*someone else said*:
> > So you're saying the likely usage patterns will involve giving the compiler "hints" in the form of casts to invariant on data that actually isn't?
> > --bb
> I think he's trying to say that it's possible to cast away invariant.
When we program, we assume that data doesn't arbitrarily change, deus ex machina, between steps of our algorithm. If it does, the algorithm will most often fail to perform its intended function.
What "invariant" does is make this assumption explicit: it marks which parts we accept may change out from under us, and which parts aren't allowed to.
The parts that aren't allowed to change are enforced by the compiler; but that by no means makes the program's outcome certain. Just because your own code cannot tamper with the data doesn't mean it can't be tampered with in a VM, by a debugger, or whatnot; so the assumption is known to be false in at least some cases. Hence we declare the results "undefined" and are done with it.
This is honesty.
Now, because we declare our awareness of this assumption, the compiler can apply optimizations and enforce restrictions that give us the things in the list above.
Hope that clarifies my view on it.