February 07, 2006
Matthew wrote:
> "Sean Kelly" <sean@f4.ca> wrote in message
> news:ds927o$156u$1@digitaldaemon.com...
>> I'm not sure I see the problem.  Since the warnings mode is optional,
>> what's to stop you from building with warnings until you fix everything
>> it flags that you care about then turn the option off and build to test?
>>
>> My only issue is that with "warnings as errors" as the only
>> warnings-enabled option, there must be a code-level way to fix anything
>> the compiler warns about.  Otherwise, shops that build with warnings
>> enabled as the release mode would be SOL if the compiler warns about
>> something intended.
>>
>> This is typically how I do things with VC++: require no messages at warning level 4 (/W4)
>> and preprocessor directives to situationally disable any warnings about
>> utterly stupid things (like the "standard function is deprecated"
>> business in VC 2005).  Things like implicit narrowing conversions are
>> flagged by the compiler and casts are added where appropriate, etc.  To
>> me, fixing warnings is mostly a matter of either fixing the bug or
>> telling the compiler "yes, I meant to do that."  I can't think of a
>> situation where I'd want to ship with some warning messages still
>> displaying, assuming I cared enough to enable them.
> 
> You've replied to my last post, but none of this seems to address anything I
> raised. Did you reply to the wrong part?

No, but perhaps this was somewhat of a tangent.  I was mostly suggesting that you compile with warnings on to ensure no new errors are introduced, and if you still can't produce a linkable object for some reason then turn warnings off and make sure the run-time bug still exists.  Warnings are optional, after all.

> For the record, I'm not talking about shipping with warnings still firing.
> That's madness to do, and I suspect my opinion on that, and practice in not
> doing so, to be identical to your own.

I actually do release builds with warnings enabled for shipping applications, but the warnings I'm used to are compile-time ones.  If D has run-time warnings, that's new to me (I've never actually used "-w" in D--shame on me).  To me, the benefit of warnings is to point out potential coding mistakes that are syntactically valid--implicit narrowing conversions being the big one IMO--and to do so at compile-time.

> Anyway, my entire point is tightly focused on the contradiction between
> Walter's stock technique for debugging, which he advises all comers to adopt
> whenever they report a problem, and the restrictions imposed by the DMD
> compiler that directly defeat that advice. I await that point being
> addressed, rather than some deflectory digression, with extreme patience.

I apologize.  I think I'm simply missing the point.


Sean
February 07, 2006
Matthew wrote:
> "Don Clugston" <dac@nospam.com.au> wrote in message
> news:dsa2t0$1vhn$1@digitaldaemon.com...
> 
>> That is, you enable warnings so that you can see what assumptions the
>> compiler is making. You add an early return statement in order to
>> quickly simplify the situation, with minimal side-effects. When you do
>> this, you want to see how the compiler's assumptions have changed.
>> But unfortunately, the warnings-as-errors makes this impossible to do.
>>
>> Right?
> 
> Pretty much.

Oh I see.  I wasn't aware of the dead code warning.  Your complaint makes a lot more sense now.
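
For anyone following along, here's a minimal sketch of the scenario as I understand it (the function is made up, and I'm assuming the unreachable-statement warning that -w reportedly gives):

    int compute(int x)
    {
        return 0;      // early return added to quickly simplify things
        return x * 2;  // with -w this statement is flagged as unreachable,
                       // and since warnings are treated as errors, no
                       // object file is produced
    }

Without -w the same code compiles silently, so the act of chopping changes what the compiler will accept.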

> D continues to have a foot firmly in each of the two idiotic camps of "the
> compiler knows better than the user" (so you can't choose to generate object
> code with something containing warnings however much you know you need/want
> to, and so on)

I agree that it should be possible to disable the aforementioned warning, as I occasionally do the same thing.  Still, there's nothing to stop you from disabling warnings altogether if those are the only ones you're seeing.


Sean
February 07, 2006
Derek Parnell wrote:
> On Mon, 06 Feb 2006 07:34:37 +1100, Walter Bright <newshound@digitalmars.com> wrote:
> 
>>
>> "Derek Parnell" <derek@psych.ward> wrote in message
>> news:op.s4imylg86b8z09@ginger.vic.bigpond.net.au...
>>> As far as Walter is concerned, a 'warning' is really an error too. So yes,
>>> DMD will display warnings but treat them as errors such that the compiler
>>> doesn't generate code.
>>
>> That's right. If the programmer cares about warnings, he'll want them to be
>> treated as errors. Otherwise, the warning message could scroll up and all
>> too easily be overlooked.
> 
> However, not everyone thinks like Walter. I, for example, regard a warning as different from an error. A warning tells me about an issue for which the compiler has had to make an assumption in order to generate code. An error tells me about an issue for which the compiler has decided that no assumption can be made, and thus it can't generate valid code.
> 
> A warning alerts me to the fact that the compiler has made an assumption on my behalf, and thus I might want to do something to avoid a 'wrong' compiler assumption. These assumptions are not, per se, errors. If "-w" is not used the compiler generates code, so I can't see why it can't still generate code when "-w" is used. All I want to know is where the compiler has assumed things I didn't actually intend.
> 
> --Derek Parnell
> Melbourne, Australia

I too agree with this view, and we have voiced these concerns before (recall news://news.digitalmars.com:119/dn5en0$a4j$1@digitaldaemon.com or http://www.digitalmars.com/d/archives/digitalmars/D/bugs/5743.html ).

So let me just make some comments in light of Walter's comments.

Why we want warnings: we want to be notified about segments of code that may be incorrect.
Why we don't want halt-compilation-on-warnings (as the only option available): because during development it is entirely expected that, at any given moment, many parts of the code will be incorrect and the program as a whole will be incomplete, yet we still want to be able to execute it, as doing so is nonetheless meaningful.
There may be certain parts of the code that we *know* are incorrect, and whose warnings we don't care about (at the moment), while we do care about warnings coming from *another* part of the code. Disabling warnings hides the warnings from the part we do care about, and warnings-as-errors does not let us compile the program without first fixing the part we don't care about.
One example of this situation is the aforementioned test-and-chop method, which I too use often. It is quite common when chopping (or just commenting out, which is what I do) that a certain segment of code ends up with an unused variable or an invalid code path, yet the program as a whole is still meaningfully executable. When I'm stubbing out a certain part AA of the code for testing purposes, I don't want to have to change another part BB (and later change BB back when AA is fixed or stubbed back in) just because BB annoys the compiler, even though I know the program is meaningful with BB unchanged. It makes the code slightly less agile (i.e., harder to change).
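
A rough sketch of what I mean (the names are invented, and I'm assuming the unreachable-statement warning that -w gives today):

    int computeAA() { return 42; }   // hypothetical helper

    void partAA()
    {
        return;        // stubbed out for this test run
        computeAA();   // unreachable now; I know, and for the moment I
                       // don't care about the warning coming from here
    }

    void partBB()
    {
        // the part I am actually working on and do want warnings from,
        // but with warnings-as-errors the stub in partAA stops the
        // whole build from compiling
    }

    void main()
    {
        partAA();
        partBB();
    }

The stub in partAA is deliberate and temporary, yet it alone is enough to halt compilation of everything.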

One can say: oh, but then you can disable warnings-as-errors when you're doing the test-chop runs, and enable it when you're doing 'non-chop' development. But that is plain idiotic. I'm not going to change my build config files every time I want to switch from one mode to another. In fact, since different parts can be under work by different people, this idea is simply not workable.

I find that this situation/rationale reminds me a lot of the checked exceptions issue (although not as bad). People imagine a clean, pristine, utopian world where code always compiles without warnings [where all thrown exceptions are checked]. Yet reality and practice don't work that way, and people are annoyed at being forced into the pristine ideal. So people either put up with the lost agility, cleaning up warnings on every compile [writing extra try/catch/throws clauses], or find workarounds, such as disabling warnings altogether [wrapping normal exceptions in unchecked exceptions].


Walter Bright wrote:
> That's right. If the programmer cares about warnings, he'll want them to be
> treated as errors. Otherwise, the warning message could scroll up and all
> too easily be overlooked.
>
>
I think the days of "scrolling up" are over. It's in-editor error and warning reporting now.


Walter Bright wrote:
>
>> I'm quite sure that other people can also quote from their C++ experience
>> with instances to the contrary -
>
> If there's a compelling case to be made, bring it on <g>.
>

I hope I made one. It is confirmed by my practical coding experience in C++, Java, and C#.


Walter Bright wrote:
> Think of them as "optional errors" instead of warnings <g>.
>
Then can we (by 'we' I mean DMD and the docs) at least not call them warnings? Because they are not. The option could be called strict, pedantic, or something like that, just not 'warnings'. That alone would alleviate a great deal of my contention with this issue.


-- 
Bruno Medeiros - CS/E student
"Certain aspects of D are a pathway to many abilities some consider to be... unnatural."
February 07, 2006
"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:dsabfk$282a$1@digitaldaemon.com...

>>Walter wrote:
>>> Your point sounded like you were
>>>worried about introducing bugs during chop-and-test. I recommended
>>> making a copy first, then no bugs can be introduced.
> It ignores the clear and very important issue that one might want to
> continue to have warnings emitted, in order to keep the "mental picture"
> of the compilation output consistent, in order to ensure that any changes
> to try and stop the compiler from its ICEs, e.g. reordering seemingly
> unrelated statements, have not done anything that's a _new_ problem, in
> order to see if a commented out block highlights a new warning, and so on
> and so forth.

So I summed up your point accurately.

> Of course, Walter can counter that each and every change should be atomic,

Nope. The only goal with the chop is "does the problem I am trying to isolate remain, or does it go away?" It's binary, yes or no. If you're introducing all kinds of other considerations, that's likely why you find chop-and-test to be so unusably arduous.

> but it's not living in the real world, or at least not in my real world. I want to work how I work best, not how Walter works best.

You often send me very large bug reports with a comment that you've been unable to chop them down further and that you've no idea what's going wrong. Every one of them I've reduced, using chop-and-test, to 10 lines or so after a few minutes. None of them hinged on -w in any way or required more than two copies of any file. Your way isn't working (in the sense that it isn't finding the cause of the problem), so why is it best?

This would probably be helped a lot if I could look over your shoulder while you're doing chop-and-test, so I could show you how to do it.


February 07, 2006
"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9tq1$1r8n$1@digitaldaemon.com...
> I'd love to be able to poo-poo your statements as laughable fantasy, but I have no doubt that you believe what you're saying and that you're speaking the truth about your own experiences. What continues to sadden me is the apparent narrow-mindedness in that you project a certain belief that your experiences are enough of a sample to make pronouncements and direct tool strategy for everyone else.
>
> I can tell you that there are plenty of times when a binary approach such
> as you describe has not proved enough for me, perhaps as a consequence of
> my working with non-trivial C++ code (templates, etc.). There are often
> instances where a combinatorial effect on the compiler cannot be reduced
> in the manner you describe, and many more where the effort involved to
> bring everything into one file is far more than necessary compared with
> modifying several in parallel. I know you don't believe me about that,
> but that doesn't make it not so.

I know that you find this to not work, and you've often sent me those cases. None of them resisted reduction to a few lines.

I've been working on solving template-related bugs for 10 years :-(. At the end of the process, each and every one of the minimal examples goes into the DMC++ test suite, so I have quite an extensive test suite (maybe a thousand examples just for templates). Every one was reduced using the process I described, and they are all just a few lines of code.

And, as a pretty general rule, none of the problems made any sense until they were so reduced, and they often did start out as incomprehensibly complex templates.

There is a very solid factual basis for what I said works in isolating problems. The proof is not only in the DMC++ (and DMD) test suite, but in the irreducible cases you sent me that I reduced and sent back. If you want to dismiss such a successful track record, including success on your cases, as laughable fantasy, what can I say?

P.S. I might uselessly add that none of this was aided or hindered by -w.


February 07, 2006
"John Demme" <me@teqdruid.com> wrote in message news:dsaaab$26tt$1@digitaldaemon.com...
> Walter Bright wrote:
>> Each warning has an associated source code modification that will get rid of it.
> I assume you mean the user's source code- but what if the user's code is right?  Or more importantly, not right- but how they want it.  The compiler's not always right... if it was always right and the warnings always pointed out code that was wrong, then they'd be errors.

This goes back to why having warnings at all is a bad idea, as it implies a wishy-washy language specification. Is code legal or is it not?


February 07, 2006
"Bruno Medeiros" <daiphoenixNO@SPAMlycos.com> wrote in message news:dsaijo$2f0r$1@digitaldaemon.com...
> One can say: oh, but then you can disable warnings-as-errors when you're
> doing the test-chop runs, and enable it when you're doing 'non-chop'
> development. But that is plain idiotic. I'm not going to change my build
> config files every time I want to switch from one mode to another.
> In fact, since different parts can be under work by different people, this
> idea is simply not workable.

Chop-and-test is not normal development. It is done with a local copy of the project, including the makefiles, not code that you or other people are working on.

The whole technique of chop-and-test implies being able to hack and slash away at the project source with abandon. You don't worry about breaking the build, warnings, introducing bugs, or screwing up someone else's part. This is all possible because one is working with a local copy that will be deleted as soon as the problem is found and a solution identified. The solution is then folded into the real development branch.


February 07, 2006
"Walter Bright" <newshound@digitalmars.com> wrote in message news:dsapis$2mfo$1@digitaldaemon.com...

> This goes back to why having warnings at all is a bad idea, as it implies a wishy-washy language specification. Is code legal or is it not?

Are you talking about a real-world PL which is evolving
over time and where eventual syntax changes are
deemed desirable or even necessary?

Or are you talking about a perfect and therefore fictional language which is released as V1.0 and stays there and shines forever? If so, I'd have less trouble agreeing.

The second scenario does not apply to D, because it seems to be more the
evolving type - and I am glad of it. So you might consider continuing to
issue warnings, for several reasons. I'll give you two examples:

1) Let us - just theoretically - assume that at some stage you want to
disallow the optional C-like syntax for declaring arrays. If such a major
move is going to happen in the not so near future, you would be well
advised to introduce warnings over several compiler releases instead of
moving radically and forcing even newbies to search for an "allow
deprecated features" switch to get their job done.

2) One feature which usually prevents otherwise unnecessary debugging
sessions is to offer a warning to the less-than-perfect user about
declared but otherwise unused variables. They might be there in error or
be intentionally inserted to serve a future purpose. Your compiler has no
way to know, but it could politely issue a warning, which would be far
better than ignoring the fact or refusing to compile altogether.
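
To make the two examples concrete, a small illustration (purely hypothetical: as far as I know no such deprecation is planned, and DMD does not currently warn about unused variables):

    // 1) C-like array declaration syntax that could one day be phased
    //    out gradually via warnings rather than cut off abruptly
    int values[10];   // C-style declaration, currently accepted
    int[10] counts;   // the native D style

    // 2) a declared but unused variable: maybe a mistake, maybe a
    //    placeholder for future code; the compiler cannot know, so a
    //    polite warning beats both silence and a refusal to compile
    void main()
    {
        int unused;
    }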



February 08, 2006
On Mon, 6 Feb 2006 21:55:56 -0800, Walter Bright wrote:

> "Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9b89$1bq2$1@digitaldaemon.com...
>>> Or copy the file(s) involved and chop the cop(y|ies).
>> Indeed. But I think you'd agree that that adds considerable overhead to what is usually a very chop-intensive activity. Chop-and-test is itself a horribly arduous task.
> 
> ??
> 
>> To have to be making potentially 10s or even 100s of
>> copies of a given file,
> 
> No need for more than two. I have to wonder what you are doing.
> 
>> and keeping track of what changes go where,
> 
> No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.
> 
>> is going to turn it into a really time-consuming and horribly arduous task.
> 
> If it is, you're going about it inefficiently.
> 
> I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.

I'm backing Walter's case here. Although doing the chop-and-retest can be boring, there is no need to make it hard, and it always seems to give you a small code example of the bug.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Down with mediocracy!"
8/02/2006 11:56:49 AM
February 08, 2006
On Tue, 7 Feb 2006 10:04:54 -0800, Walter Bright wrote:

> "John Demme" <me@teqdruid.com> wrote in message news:dsaaab$26tt$1@digitaldaemon.com...
>> Walter Bright wrote:
>>> Each warning has an associated source code modification that will get rid of it.
>> I assume you mean the user's source code- but what if the user's code is right?  Or more importantly, not right- but how they want it.  The compiler's not always right... if it was always right and the warnings always pointed out code that was wrong, then they'd be errors.
> 
> This goes back to why having warnings at all is a bad idea, as it implies a wishy-washy language specification. Is code legal or is it not?

NO NO NO! May I be so bold as to reword John's paragraph according to how I see the issues.

"
I assume you mean the user's source code- but what if the user's code is
what the coder intended?  Or more importantly, not what the coder really
intended.  The compiler's assumptions are not always the same as the
coder's intentions...  if they were always the same and the warnings always
pointed that out, then they'd be useless.
"

The "-w" switch is useful to seeing where my understanding of what the code says and what the compiler thinks it says, is different.

-- 
Derek
(skype: derek.j.parnell)
Melbourne, Australia
"Down with mediocracy!"
8/02/2006 12:20:01 PM