February 07, 2006
"Sean Kelly" <sean@f4.ca> wrote in message news:ds927o$156u$1@digitaldaemon.com...
> I'm not sure I see the problem.  Since the warnings mode is optional, what's to stop you from building with warnings until you fix everything it flags that you care about then turn the option off and build to test?
>
> My only issue is that with "warnings as errors" as the only warnings-enabled option, there must be a code-level way to fix anything the compiler warns about.  Otherwise, shops that build with warnings enabled as the release mode would be SOL if the compiler warns about something intended.
>
> This is typically how I do things with VC++: require no messages at WL:4 and preprocessor directives to situationally disable any warnings about utterly stupid things (like the "standard function is deprecated" business in VC 2005).  Things like implicit narrowing conversions are flagged by the compiler and casts are added where appropriate, etc.  To me, fixing warnings is mostly a matter of either fixing the bug or telling the compiler "yes, I meant to do that."  I can't think of a situation where I'd want to ship with some warning messages still displaying, assuming I cared enough to enable them.

You replied to my last post, but none of this seems to address anything I raised. Did you reply to the wrong post?

If this is a reply to my post, then I'm stumped. I cannot express my point - that seems to me to be one of the most obvious things ever discussed on this NG - any more clearly. Maybe I just have to assume I'm mad. (Not entirely without potential, as a hypothesis.)

For the record, I'm not talking about shipping with warnings still firing. That's madness to do, and I suspect my opinion on that, and practice in not doing so, to be identical to your own. (Interestingly, regarding the reluctance to allow compiler flags for warning suppression: the only warnings I ever get, and which I am dangerously forced to ignore wholesale, are from linkers, since there seems to be no mechanism to advise them to selectively shut up. I know that's only real world C and C++ experience, and therefore of little consequence to the Decision-making process, but I think it makes its point pretty clearly.)

Anyway, my entire point is tightly focused on the contradiction between Walter's stock technique for debugging, which he advises all comers to adopt whenever they report a problem, and the restrictions imposed by the DMD compiler that directly defeat that advice. I await that point being addressed, rather than some deflectory digression, with extreme patience.

:-)



February 07, 2006
> > Anyway, here's the situation as I see it, as I can most plainly express it:
> >
> > 1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.
> >
> > 2. You offer a compiler that does not allow a user to generate code while observing warnings.
> >
> > These two are in obvious conflict. Anyone with any problem compiling any
> > compiled-language will, in following your stipulation to the
> > chop-and-test
> > approach, have to remove code. Those with a version control system, a
> > huge Undo buffer and perfect faith in their IDDE not to crash may
> > genuinely
> > _remove_ code.
>
> Or copy the file(s) involved and chop the cop(y|ies).

Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop-intensive activity. chop-and-test is itself a horribly arduous task. To have to be making potentially 10s or even 100s of copies of a given file, and keeping track of what changes go where, is going to turn it into a really time-consuming and horribly arduous task.

I don't want to always be coming across as this nasty sarcastic English cynic, who just refuses to buy into the Dream, but D is, for one thing, touted as being superior in productivity to C/C++. And then Walter imposes restrictions on the dominant (and largely defining) compiler based on two things: his needs/wants/experience as a compiler writer, and his experience/practices as a C/C++ coder. While I have huge respect for his achievements in both these fields, they do not represent *all* knowledge/experience/practice (nor even the best, perhaps). _In this particular case_, Walter's ideas/prejudices/whatever are a complete contradiction of the claims of D, hindering rather than helping.

As my mother always said (and continues to say) to me: "Watch that hubris".

> > Those more careful/careworn might instead put in an early
> > return statement here and there. Since the last thing one wants to do
> > when
> > debugging is to introduce new bugs, such users might sensibly wish to
> > continue to be informed of potential problems by having their compiler
> > emit as many warnings as it can.
>
> True, but not strictly required to produce a test case.

Not at all. There are many ways. But why proscribe a more productive one in favour of less productive ones in the name of one person's practice?

And, before anyone says it, Yes, I know it's Walter's language, and he can do what he likes. My radical hypothesis is that by facilitating the (different) practices of other developers, more developers are likely to be attracted to the language.



February 07, 2006
"Walter Bright" <newshound@digitalmars.com> wrote in message news:ds9a6g$1b2m$1@digitaldaemon.com...
>
> "Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds90n7$1411$1@digitaldaemon.com...
> > "Walter Bright" <newshound@digitalmars.com> wrote in message news:ds8lvv$s2k$1@digitaldaemon.com...
> >> "Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds8jkh$qf8$1@digitaldaemon.com...
> >> > I'm still waiting for Walter to explain the paradox between the
> >> > bug-finding
> >> > activities that he recommends to just about everyone that ever raises a compiler problem and not being able to generate code and link with
> >> > something
> >> > that emits a warning. (I may be waiting some time ...)
> >> If you can show an example where being able to execute code that
> >> generates a
> >> warning will make the cause of the warning less mysterious, I'd be very
> >> interested.
> > Are you affecting obtuseness as a means of stimulating debate for the
> > people
> > reading but not participating? (Not rhetoric. I'm genuinely confused.)
>
> I genuinely have no idea what the problem is you're talking about.
>
>
> > Anyway, here's the situation as I see it, as I can most plainly express it:
> > 1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.
> > 2. You offer a compiler that does not allow a user to generate code while observing warnings.
> >
> > These two are in obvious conflict.
>
> No, they aren't. If the person doesn't understand why the warning is being generated, executing the code won't help in the slightest. If it isn't about the warning, just turn off warnings. Turning off the warnings isn't any more work than turning on "warn but keep going".
>
> > Anyone with any problem compiling any
> > compiled-language will, in following your stipulation to the chop-and-test approach, have to remove code.
>
> You write as if there's something wrong with the chop and test approach, or that it's something unique. It's not unique (it's taught at better universities). It's a generally and widely used problem-solving technique, also known as divide-and-conquer. And it's the only one that has a reasonable chance of success with a complex system. I've spent 30 years in engineering, and those people who do not know how to reduce the problem domain spend hours, days, *months* futzing about chasing phantoms, wasting time, trying things essentially at random, and never figuring out what's going wrong.
>
> > Those with a version control system, a huge
> > Undo buffer and perfect faith in their IDDE not to crash may genuinely
> > _remove_ code. Those more careful/careworn might instead put in an early
> > return statement here and there. Since the last thing one wants to do when debugging is to introduce new bugs, such users might sensibly wish to
> > continue to be informed of potential problems by having their compiler
> > emit
> > as many warnings as it can.
>
> The way to do chop-and-test is to make a copy of your project, and work from the copy. Or at least make a full backup first. If you do not make a copy first, you are nearly guaranteed to screw up your source code, whether or not you are using CVS or have undo buffers. Warnings aren't going to help in the slightest.
>
> There's no reason whatsoever to worry about introducing new bugs doing this.
>
> If you are in a position where you are somehow unable to back up your project or work from a copy of it, then for whatever reason you've completely lost control over the project and have much bigger problems than warnings.
>
>
> > Please point out the flaw in my reasoning of your paradox, and, further, explain to me how, in following this technique in many circumstances where debugging has been necessitated in C/C++/D/Ruby/Java/.NET, I have gone so astray in my practice. What should I have been doing instead?
>
> Make a full copy of your project, and work from that.
>
> > This is not
> > rhetoric for the sake of it: please do elucidate. If you cannot, please
> > explain how DMD's current contradiction of 1 and 2 is not an
> > arbitrary shackle to its users,
>
> You're worried about screwing up your project by applying divide-and-conquer, so you want warnings to continue. That is a misunderstanding of how to go about it. Make a copy of your project. Then hack and slash the copy until you've isolated the problem. Turn off the warnings. Stop worrying about introducing a bug - it's a sacrificial copy, not the original.
>
> > one based solely on your own work practices and prejudices that,
> > I hope, you would have to concede may not map to those of all other
> > developers.
>
> Trying to use warnings as a substitute for making a backup is like a carpenter trying to hammer a nail with a screwdriver. It's the wrong answer, and trying to make a screwdriver that can be used as a hammer is also the wrong answer.

I get all of the above - some of which does _not_ address my point, and rather lights little adjacent campfires for us to worry about without addressing the point (what's new?) - but that's all from the position of _your own experience_. It does not accord with mine. If I'm unique, and uniquely wrong, then don't worry about it. But if I'm not, then you're making life harder than necessary for some potential users. Since life using D is hard enough already, I think it's a bad strategy for getting wider acceptance of D.

I don't expect you to change your mind now any more than I did when I posted my original response on this thread, since you so rarely do. I just have the unfortunate personality trait of being unable to resist bursting bubbles. One of the things you claim is that D helps people work faster. This is an example where that claim is invalid.



February 07, 2006
"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9b89$1bq2$1@digitaldaemon.com...
>> Or copy the file(s) involved and chop the cop(y|ies).
> Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop intensive activity. chop-and-test is itself a horrible arduous task.

??

> To have to be making potentially 10s or even 100s of
> copies of a given file,

No need for more than two. I have to wonder what you are doing.

> and keeping track of what changes go where,

No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.

> is going to turn it into a really time-consuming and horrible arduous task.

If it is, you're going about it inefficiently.

I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.




February 07, 2006
"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9bs7$1c6t$1@digitaldaemon.com...
> I get all of the above - some of which does _not_ address my point,

I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.

> One of the things you claim is that D helps people work faster. This is an example where that claim is invalid.

I expect that whatever you're doing that results in hundreds of copies of a single file is going to be slow going in any language, but that is nothing I ever recommended. Chop-and-test requires at most 2 copies.



February 07, 2006
"Walter Bright" <newshound@digitalmars.com> wrote in message news:ds9dn9$1ddr$1@digitaldaemon.com...
>
> "Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9b89$1bq2$1@digitaldaemon.com...
> >> Or copy the file(s) involved and chop the cop(y|ies).
> > Indeed. But I think you'd agree that that adds considerable overhead on to what is usually a very chop-intensive activity. chop-and-test is itself a horribly arduous task.
>
> ??
>
> > To have to be making potentially 10s or even 100s of
> > copies of a given file,
>
> No need for more than two. I have to wonder what you are doing.
>
> > and keeping track of what changes go where,
>
> No need for that. It's a binary thing - does this chop make the problem disappear or the problem stay? If it stays, forget about the previous changes. If it goes away, copy the previous version back and chop something else. No need to keep a 'stack' of more than one change.
>
> > is going to turn it into a really time-consuming and horrible arduous task.
>
> If it is, you're going about it inefficiently.
>
> I use an editor that automatically makes a backup copy when I save an edited file. Then, if the edit is unsuccessful, I just copy the backup copy back over the edited copy. Nothing arduous.

I'd love to be able to pooh-pooh your statements as laughable fantasy, but I have no doubt that you believe what you're saying and that you're speaking the truth about your own experiences. What continues to sadden me is the apparent narrow-mindedness in that you project a certain belief that your experiences are enough of a sample to make pronouncements and direct tool strategy for everyone else.

I can tell you that there are plenty of times when a binary approach such as you describe has not proved enough for me, perhaps as a consequence of my working with non-trivial C++ code (templates, etc.). There are often instances where a combinatorial effect on the compiler cannot be reduced in the manner you describe, and many more where the effort involved to bring everything into one file is far more than necessary compared with modifying several in parallel. I know you don't believe me about that, but that doesn't make it not so.

Perhaps the answer is that I should not have the temerity to stretch the language to its limits? But I think not.

I continue to support DMC++ and D and you personally, and I want all to prosper highly, but you do seem to live in a world that does not relate to me and my experiences, and I am not yet sufficiently deluded to believe that I'm unique. Common sense therefore compels me to think that you have an overly narrow and prescriptive view of software engineering, and that that influences the progress of D to its detriment.

Matthew

btw, I use chop-and-test, and recognise its value without reservation. But the same applies for flossing one's teeth. Doesn't stop either from being a vile activity. None of which has much if anything to do with my specific point, to which I continue to await any response.


February 07, 2006
"Walter Bright" <newshound@digitalmars.com> wrote in message news:ds9o1o$1ml8$2@digitaldaemon.com...
>
> "Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9bs7$1c6t$1@digitaldaemon.com...
> > I get all of the above - some of which does _not_ address my point,
>
> I have no idea what your point was, then. Your point sounded like you were worried about introducing bugs during chop-and-test. I recommended making a copy first, then no bugs can be introduced.

I made the point quite clearly.

> > One of the things you claim is that D helps people work faster. This is an example where that claim is invalid.
>
> I expect that whatever you're doing that results in hundreds of copies of a single file is going to be slow going in any language, but that is nothing I ever recommended. Chop-and-test requires at most 2 copies.



February 07, 2006
Matthew wrote:
> "Walter Bright" <newshound@digitalmars.com> wrote in message
> news:ds9o1o$1ml8$2@digitaldaemon.com...
> 
>>"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message
>>news:ds9bs7$1c6t$1@digitaldaemon.com...
>>
>>>I get all of the above - some of which does _not_ address my point,
>>
>>I have no idea what your point was, then. Your point sounded like you were
>>worried about introducing bugs during chop-and-test. I recommended making a
>>copy first, then no bugs can be introduced.
> 
> 
> I made the point quite clearly.

It's not very clear to me. Was the point actually made by Derek's post?
(Your first post seemed to assume that the point had already been made.)

That is, you enable warnings so that you can see what assumptions the compiler is making. You add an early return statement in order to quickly simplify the situation, with minimal side-effects. When you do this, you want to see how the compiler's assumptions have changed.
But unfortunately, warnings-as-errors makes this impossible to do.

Right? Or have I misunderstood, too?

BTW, something that I've never seen anyone mention is the value of the -cov option for debugging. Absolutely fantastic for debugging, being able to see a 'snail trail' of where the program went. I'll never single-step through code again.

> 
>>>One of the things you claim is that D helps people work faster. This is an
>>>example where that claim is invalid.
>>
>>I expect that whatever you're doing that results in 100's of copies of a
>>single file is going to be slow going in any language, but that is nothing I
>>ever recommended. Chop-and-test requires at most 2 copies.
February 07, 2006
Walter Bright wrote:

> 
> "John Demme" <me@teqdruid.com> wrote in message news:ds6ioh$227b$1@digitaldaemon.com...
>> Walter Bright wrote:
>>> I have many years of experience with warnings, and none of it finds any use for warnings that do not stop the compiler. If someone believes there is a valid reason, they need to present a very convincing case.
>>
>> How does one ignore just one specific warning or one type of warning
>> while
>> still compiling with -w?  It'd be nice if there was some way to tell the
>> compiler that a specific line or area of code is indeed correct and to
>> not spit out a warning (or, more importantly, halt on it).
> 
> Each warning has an associated source code modification that will get rid of it.

I assume you mean the user's source code- but what if the user's code is right?  Or more importantly, not right- but how they want it.  The compiler's not always right... if it was always right and the warnings always pointed out code that was wrong, then they'd be errors.

~John
February 07, 2006
"Don Clugston" <dac@nospam.com.au> wrote in message news:dsa2t0$1vhn$1@digitaldaemon.com...
> Matthew wrote:
> > "Walter Bright" <newshound@digitalmars.com> wrote in message news:ds9o1o$1ml8$2@digitaldaemon.com...
> >
> >>"Matthew" <matthew@hat.stlsoft.dot.org> wrote in message news:ds9bs7$1c6t$1@digitaldaemon.com...
> >>
> >>>I get all of the above - some of which does _not_ address my point,
> >>
> >>I have no idea what your point was, then. Your point sounded like you were
> >>worried about introducing bugs during chop-and-test. I recommended making a
> >>copy first, then no bugs can be introduced.
> >
> >
> > I made the point quite clearly.
>
> It's not very clear to me. Was the point actually made by Derek's post? (your first post seemed to assume that the point had already been made),

I said

"1. You recommend that people chop down code in order to isolate bugs. Such bugs can include compile-time and runtime bugs.

2. You offer a compiler that does not allow a user to generate code while observing warnings.

These two are in obvious conflict. Anyone with any problem compiling any
compiled-language will, in following your stipulation to the chop-and-test
approach, have to remove code. Those with a version control system, a huge
Undo buffer and perfect faith in their IDDE not to crash may genuinely
_remove_ code. Those more careful/careworn might instead put in an early
return statement here and there. Since the last thing one wants to do when
debugging is to introduce new bugs, such users might sensibly wish to
continue to be informed of potential problems by having their compiler emit
as many warnings as it can.
"

> That is, you enable warnings so that you can see what assumptions the compiler is making. You add an early return statement in order to quickly simplify the situation, with minimal side-effects. When you do this, you want to see how the compiler's assumptions have changed. But unfortunately, the warnings-as-errors makes this impossible to do.
>
> Right?

Pretty much.

Walter's point includes

 "No, they aren't. If the person doesn't understand why the warning is being
generated, executing the code won't help in the slightest. If it isn't about
the warning, just turn off warnings. Turning off the warnings isn't any more
work than turning on "warn but keep going"."

Since I know he's very smart, I sigh when I read this, as I don't know whether he's trying to quench debate before he has to actually address my point, or that he really can't see it. Either way it's sighworthy.

It ignores the clear and very important issue that one might want to continue to have warnings emitted: to keep the "mental picture" of the compilation output consistent; to ensure that any changes made to try and stop the compiler from its ICEs, e.g. reordering seemingly unrelated statements, have not introduced anything that's a _new_ problem; and to see whether a commented-out block highlights a new warning; and so on and so forth. Of course, Walter can counter that each and every change should be atomic, but that's not living in the real world, or at least not in my real world. I want to work how I work best, not how Walter works best. And anyway, that only answers one of the three reasons I've just tossed up.

Blah blah blah blah. I could go on, but this is all soooo obvious and so puerile, it blows my stack that we even have to discuss it.

D continues to have a foot firmly in each of the two idiotic camps of "the compiler knows better than the user" (so you can't choose to generate object code from something containing warnings, however much you know you need/want to, and so on) and "the language knows better than the user" (so there aren't even any warnings of any kind for implicit numeric conversions). It's like making someone at medical school do thought experiments for 7 years, and then making them do unassisted brain surgery as soon as they graduate. That we continue to have to debate this crap year on year, when far more interesting and important issues remain unresolved, is a bad joke.