July 31, 2014
On 07/31/2014 09:19 PM, Walter Bright wrote:
> On 7/31/2014 6:37 AM, Daniel Gibson wrote:
>>> The behavior you desire here is easily handled by creating your own
>>> template to exhibit it. See the implementation of enforce() for how to
>>> do it.
>>>
>>
>> Now that's a bit surprising to me, as you wrote in the other thread:
>>  > 7. using enforce() to check for program bugs is utterly wrong.
>>  > enforce() is a library creation, the core language does not recognize
>>  > it."
>>
>> Until now (being mostly a C/C++ guy), I saw assertions as a way to
>> find bugs
>> during development/testing that is completely eliminated in release
>> mode (so I
>> might still want to handle the asserted conditions gracefully there) -
>> and it
>> seems like I'm not alone with that view.
>
> The assert will not cause code to be generated to test the assertion at
> runtime with -release. But the assert must still be valid, and since it
> is valid, the optimizer should be able to take advantage of it.
>
> assert is not meant to be used as "I want my program to be valid even if
> the assert is false".
> ...

An assert does not have any bearing on program validity if assertions are disabled.

>> Because (in C/C++) assertions just vanish when NDEBUG isn't defined,
>> it would
>> never occur to me that they have any influence on such (release mode)
>> builds.
>> Doing optimizations on this would just lead to frustration, like
>> several kinds
>> of optimizations recently have (e.g. this "compiler will remove
>> bzero() on
>> memory that isn't used afterwards" bullshit).
>
> That's the way assert in C/C++ conventionally worked.

Right. Note 'conventionally' as in 'convention'. Why break the convention?

> But this is
> changing. Bearophile's reference made it clear that Microsoft C++ 2013
> has already changed,

No way, their assert macro conforms to the standard. http://msdn.microsoft.com/en-us/library/9sb57dw4.aspx

> and I've seen discussions for doing the same with gcc and clang.
> ...

And I've seen discussions for introducing silent breaking changes into D. They usually didn't happen. Why are those discussions deemed _more relevant than the standard_ in the first place?

> In fact, the whole reason assert is a core language feature rather than
> a library notion is I was anticipating making use of assert for
> optimization hints.

I was under the impression it was because of assert(0).
July 31, 2014
On 07/31/2014 09:11 PM, Walter Bright wrote:
> On 7/31/2014 7:51 AM, Tofu Ninja wrote:
>>> For example, you can have a sort function, and then at the end assert
>>> that the
>>> output of the function is sorted.
>>
>> But that is verifying that the input is sort-able....
>
> Integers are sortable, period. That is not "input".
> ...

Data types with opCmp may not be. (In fact, yours often aren't, because the subtraction trick does not actually work.)
The type and its opCmp are "inputs" to the 'sort' template. If it asserts unconditionally after sorting, some instantiations might be buggy.

>
>> All I am saying is that the idea that assert should not be used to
>> verify input
>> makes no sense at all. Every program takes in input and once a little
>> bit is in,
>> anything derived from that input is also input.
>
> You're denying the existence of mathematical identities applied to
> symbolic math.
>
>
>> Also this thread has made me firmly never want to trust assert
>> again... I have
>> actually been going though my projects and removing them now because I
>> don't
>> trust them any more...
>
> I suggest revisiting the notion of program logic correctness vs input
> verification.

You are denying the existence of standard terminology in logic and programming languages.
August 01, 2014
in/out contracts ought to remain in release builds.

Since debug blocks and asserts (other than assert(0)) disappear in release builds, I can just use them inside the in/out blocks when I want debug-only verification steps.

Compiling contracts away thus seems redundant, and pushing more verification logic out of function bodies and into contracts would help reduce noise.
August 01, 2014
On 7/31/2014 2:07 PM, Tofu Ninja wrote:
> On Thursday, 31 July 2014 at 20:24:09 UTC, Walter Bright wrote:
>> On 7/31/2014 4:36 AM, bearophile wrote:
>>> (The problem is that your have defined your own idea,
>>
>> "My" idea is the conventional one for assert - see the Wikipedia entry on it.
>
> That entry makes no mention of assert being used as an optimization hint.
>

Saying that a predicate is always true means it's available to the optimizer.
August 01, 2014
On 7/31/2014 1:36 PM, Tofu Ninja wrote:
> On Thursday, 31 July 2014 at 19:12:04 UTC, Walter Bright wrote:
>> Integers are sortable, period. That is not "input".
>
> Ok so sorted ints are not "input", what else is not "input"? Where can I draw
> the line? And if I do use assert on an input, that is what? Undefined? I thought
> D was not supposed to have undefined behavior.

I've answered this so many times now, I no longer have any new words to say on the topic.

August 01, 2014
On 7/31/2014 12:37 PM, Daniel Gibson wrote:
> Am 31.07.2014 21:19, schrieb Walter Bright:
>> That's the way assert in C/C++ conventionally worked. But this is
>> changing. Bearophile's reference made it clear that Microsoft C++ 2013
>> has already changed, and I've seen discussions for doing the same with
>> gcc and clang.
>
> This will break so much code :-/

Maybe you're right. I've always thought that assert() was a simple and obvious concept, and am genuinely surprised at all the varied (and even weird) interpretations of it expressed here. I've never heard such things in my 30+ years of using assert().

I have seen some misuse of assert() before, but the authors knew they were misusing it and didn't have an issue with that.

I can't even get across the notion of what a program's inputs are. Sheesh.

Or perhaps some people are just being argumentative. I can't really tell. I can tell, however, that many of these sub-threads have ceased to contain any useful discussion.



>> In fact, the whole reason assert is a core language feature rather than
>> a library notion is I was anticipating making use of assert for
>> optimization hints.
>
> So why is this not documented?

Frankly, it never occurred to me that it wasn't obvious. When something is ASSERTED to be true, then it is available to the optimizer. After all, that is what optimizers do - rewrite code into a mathematically equivalent form that is provably the same (but cheaper to compute). Its inputs are things that are known to be true.

For example, if a piece of code ASSERTS that x is 3, thereafter the optimizer knows that x must be 3. After all, if the optimizer encounters:

   x = 3;

do I need to then add a note saying the optimizer can make use of that fact afterwards? "assert" is a very strong word; it does not mean "maybe" or "possibly" or "sometimes" or "sort of".

When you write:

   assert(x == 3);

then at that point, if x is not 3, then the program is irretrievably, irredeemably, undeniably broken, and the default action is that the program is terminated.

The idea, expressed here by more than one person, that this is not the case in their code is astonishing to me.


> The assert documentation isn't even clear on when an AssertError is thrown and
> when execution is halted (http://dlang.org/contracts.html doesn't mention halt
> at all).
> And it says "When compiling for release, the assert code is not generated." - if
> you implemented assert like this so the optimizer can assume the assertion to be
> true even when "assert code is not generated" this is certainly something
> developers should be very aware of - and not once you actually implement that
> optimization, but from day one, so they can use assert accordingly!

I agree it is now clear that the documentation needs to be clarified on that point. What I considered as obvious apparently is not.


> This thread however shows that many D users (who probably have more D experience
> than myself) are not aware that assert() may influence optimization and would
> prefer to have separate syntax to tell the optimizer what values he can expect.

It's also true that many D users are unclear what data flow analysis is, how it is used in compilers, and the state of the art of such optimizations. And they shouldn't need to. D would be a terrible language indeed if only compiler guys could write successful programs with it.

What users do need to understand, is if they write:

    assert(x < 10);

that at that point in the code, they are GUARANTEEING that x is less than 10, regardless of compiler switches, checked or not, and that the program is absolutely broken if it is not.

And let the compiler take it from there.

August 01, 2014
On 7/31/2014 12:46 PM, H. S. Teoh via Digitalmars-d wrote:
> AFAIK, the compiler currently doesn't use assert as a source of
> information for optimization, it's just being proposed.

It already does by default, the proposal is to do it for -release.

August 01, 2014
On 7/31/2014 1:11 PM, Daniel Gibson wrote:
> I guess in many cases I'd avoid using assert() in fear of breaking my
> defensively written program (like that example from earlier: assert(x !is null);
> if(x) { x.foo = 42; }).

I just hang my head in my hands at that code :-)

Really, you're much better off writing something like:

   import std.stdio : writeln;
   import core.stdc.stdlib : abort;

   if (x) {
      x.foo = 42;
   }
   else {
      writeln("Sack the QA dept. for not testing this code");
      abort();
   }

which expresses what you intend directly and would be perfectly fine.


> Somehow all this assumes that you found all problems (at least the ones you
> check in assert()) by testing during development..
> *I* personally wouldn't be so arrogant to assume that my code is bugfree just
> because it passed testing...

No worries, I'd probably spare you and sack the fellow on your left!

August 01, 2014
On 7/31/2014 12:48 PM, bearophile wrote:
> I am against the idea of turning asserts into a dangerous assert-assume hybrid.
> And you have ignored some answers by Timon and others.

Every post I answer spawns 5 more responses. It's exponential. At some point, I have to give up. I've spent 2 days on this.
August 01, 2014
On 7/31/2014 3:23 AM, Tobias Müller wrote:
> Walter Bright <newshound2@digitalmars.com> wrote:
>> On 7/30/2014 10:16 PM, bearophile wrote:
>>> But you have redefined "assert" to mean a mix of assume and assert.
>>
>> I haven't redefined anything. It's always been that way. It's used that
>> way in C/C++ (see your Microsoft C++ link).
>
> Actually I cannot find anything in (the latest draft of) the C standard
> that would allow that. There's no mention of undefined behavior if an
> assertion is not met with NDEBUG defined. It's clearly defined what the
> macro should expand to in that case.

Yes, you are correct about that. But I suspect that is an artifact of the C standard, and its long history of compilers that did essentially no optimization at all (my C compiler was the first DOS C compiler to do data flow analysis). The C standard was written long after C was ubiquitous, and tended to simply document what compilers did.

If you look at the Wikipedia article,

  http://en.wikipedia.org/wiki/Assertion_(software_development)

you'll see a more high-level view of what assert is all about, rather than the worm's-eye view the C standard takes. (It even uses C for its examples.) Up until this thread, I've never encountered someone who thought differently about it.

You might also want to consider Hoare's seminal paper on the topic published in the 1970's, which my view adheres to. Then there's the extensive literature on asserts for contract programming, again with the same view of asserts.

Consider also Meyer's comprehensive tome on the topic, "Object-Oriented Software Construction" (1997), where he writes:

"A run-time assertion violation is the manifestation of a bug in the software."

    -- pg. 346

There isn't any weasel-wording there, there is no hint of an assert not actually having to be valid if the program is optimized, etc.