July 10, 2008
Markus Koskimies:
> For me, it could even warn about indentation quirks, like:
> 
> 	...
> 	if(a == b)
> 		do_that();
> 		do_that_also();
> 	...
> 
> ...in which case the compiler could stop and say: either add {}'s or correct the indentation :)

Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute.

Bye,
bearophile
July 10, 2008
JAnderson wrote:

> The more warnings as errors the better.  If I have to suffer a little for false positives *shrug*

What do you understand by "a little"?

Please look at the example from http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441

Do you recognize how many warnings a lint tool might emit on that code? Would you admit, then, that a paranoid lint would be quite useless, even if it detects that the variable `p' should be accessed? Would you admit that you yourself are unable to decide whether the presence of some statements accessing `p' should suppress the warning?

My understanding of lint tools is that they incorporate a collection of programming patterns together with a fuzzy recognition algorithm. If there are enough hits for a specific pattern, but it is still only partially implemented, then warnings are generated. Given this, the primary question is: what is so special about the collection of programming patterns that they can be formalized into a lint tool but not be used as paradigms in the source language?

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful: http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
July 10, 2008
Manfred_Nowak wrote:
> JAnderson wrote:
> 
>> The more warnings as errors the better.  If I have to suffer a
>> little for false positives *shrug*
> 
> What do you understand by "a little"?

I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.

> 
> Please look at the example from http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441
> 
> Do you recognize how many warnings a lint tool might emit on that code?
> Would you admit, then, that a paranoid lint would be quite useless, even if it detects that the variable `p' should be accessed?

I don't understand. With lint, it just gives you hints about what could be wrong; you pick and choose what to fix.

> Would you admit that you yourself are unable to decide whether the presence of some statements accessing `p' should suppress the warning?

I would prefer this to be an error, like in C#. In C++, because all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.
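
For what it's worth, D already leans this way: variables get a default initializer unless you explicitly opt out with = void, so the dangerous case is deliberate and easy to grep for. A minimal sketch:

    void main()
    {
        int a;          // in D this is always 0, never garbage
        int b = void;   // explicit opt-out: deliberately uninitialized
        b = a + 1;      // first write to b; reading b before this line would be the bug
    }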

> My understanding of lint tools is that they incorporate a collection of programming patterns together with a fuzzy recognition algorithm. If there are enough hits for a specific pattern, but it is still only partially implemented, then warnings are generated. Given this, the primary question is: what is so special about the collection of programming patterns that they can be formalized into a lint tool but not be used as paradigms in the source language?

For me, anything that isn't really an error (and I think a lot more of C++'s warnings should be errors). This means the lint effort can be separate: they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate, but being the pedantic coder that I am, I prefer those as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
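
For concreteness, here is roughly what those two stub tricks look like in D (onResize is a made-up example). Note that in D a bare "height;" statement is itself rejected as having no effect, so the nearest equivalent of semicoloning the value is a cast to void:

    // hypothetical stub: the signature is dictated by some interface
    void onResize(int, int height)   // unused width left unnamed, so nothing to warn about
    {
        cast(void)height;            // marks the value "used" until the stub is filled in
    }

    void main()
    {
        onResize(640, 480);
    }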

> 
> -manfred  
> 
July 10, 2008
On Thu, 10 Jul 2008 05:00:54 -0400, bearophile wrote:

[...]
> Let's see... maybe just using indentation?
[...]

Ah, I'm a big fan of Python and I wouldn't complain if D used the same method for determining blocks ;D
July 10, 2008
Bruce Adams wrote:
> On Wed, 09 Jul 2008 09:49:34 +0100, Walter Bright <newshound1@digitalmars.com> wrote:
> 
>> Here are some horrid examples from my own code which, to please the client, had to compile with all warnings on for MSVC:
>>
>> ---
>>    p = NULL;  // suppress spurious warning
>> ---
>>    b = NULL;  // Needed for the b->Put() below to shutup a compiler use-without-init warning
>> ---
>>    #if _MSC_VER
>>    // Disable useless warnings about unreferenced formal parameters
>>    #pragma warning (disable : 4100)
>>    #endif
>> ---
>>    #define LOG 0       // 0: disable logging, 1: enable it
>>
>>    #ifdef _MSC_VER
>>    #pragma warning(disable: 4127)      // Caused by if (LOG)
>>    #endif // _MSC_VER
>> ---
>>
>> Note the uglification this makes for code by forcing useless statements to be added. If I hadn't put in the comments (and comments are often omitted) these things would be a mystery.
> 
> I would contend this is a problem with the quality of headers provided by M$.
> Library code has a greater need to be high quality than regular code.
> Operating system APIs even more so.
> Removing warnings from C/C++ headers requires you to write them carefully to
> remove the ambiguity that leads to the warning. That is, this definition of
> quality is a measure that increases with decreasing semantic ambiguity.

I think it's a complete fallacy to think that a lower number of warnings is proportional to better code quality.
Once a warning is sufficiently spurious (say, a less than 1% chance of indicating a real error), it's more likely that you'll introduce an error in getting rid of the warning.
In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.

If you insist on avoiding all warnings, you're effectively using a programming language spec that one individual has carelessly made up on a whim.

For example, VC6 generates some utterly ridiculous warnings. In some cases, the chance of it being a bug is not small, it is ZERO.

In DMD, the signed/unsigned mismatch warning is almost always spurious. Getting rid of it reduces code quality.
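
To make that concrete, the complaint almost always has this shape (a made-up but typical loop, not code from any particular project):

    import std.stdio;

    void main()
    {
        int[] arr = new int[10];
        // arr.length is unsigned and i is signed: the classic mismatch warning,
        // even though the loop is correct for any array that can actually exist
        for (int i = 0; i < arr.length; i++)
            arr[i] = i;
        writefln("%d", arr[9]);
    }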
July 10, 2008
JAnderson wrote:
> Manfred_Nowak wrote:
>> JAnderson wrote:
>>
>>> The more warnings as errors the better.  If I have to suffer a
>>> little for false positives *shrug*
>>
>> What do you understand by "a little"?
> 
> I don't understand what you're asking. I meant that if I have to fix it because the compiler tells me it's an error, then so be it. It's a little pain for a lot of gain.
> 
>>
>> Please look at the example from http://www.digitalmars.com/webnews/newsgroups.php?art_group=digitalmars.D&article_id=73441
>> Do you recognize how many warnings a lint tool might emit on that code?
>> Would you admit, then, that a paranoid lint would be quite useless, even if it detects that the variable `p' should be accessed?
> 
> I don't understand. With lint, it just gives you hints about what could be wrong; you pick and choose what to fix.
> 
>> Would you admit that you yourself are unable to decide whether the presence of some statements accessing `p' should suppress the warning?
> 
> I would prefer this to be an error, like in C#. In C++, because all my warnings are errors, it would be an error too. If you really want to use an uninitialized variable there should be a workaround, but it should be harder to do.

Before someone else corrects me: this is not an error in C#. I was thinking of "used uninitialized variable", not "variable not used". And I still prefer errors for these.

> 
>> My understanding of lint tools is that they incorporate a collection of programming patterns together with a fuzzy recognition algorithm. If there are enough hits for a specific pattern, but it is still only partially implemented, then warnings are generated. Given this, the primary question is: what is so special about the collection of programming patterns that they can be formalized into a lint tool but not be used as paradigms in the source language?
> 
> For me, anything that isn't really an error (and I think a lot more of C++'s warnings should be errors). This means the lint effort can be separate: they can continually add and remove checks while the compiler is worked on as a separate effort. Things like unused variables might be a candidate, but being the pedantic coder that I am, I prefer those as errors as well. I simply don't add an identifier, or I semicolon the value, when I'm writing stubs.
> 
>>
>> -manfred 
July 10, 2008
"bearophile" <bearophileHUGS@lycos.com> wrote in message news:g54j46$2e05$1@digitalmars.com...
> Markus Koskimies:
>> For me, it could even warn about indentation quirks, like:
>>
>> ...
>> if(a == b)
>>     do_that();
>>     do_that_also();
>> ...
>>
>> ...in which case the compiler could stop and say: either add {}'s or correct the indentation :)
>
> Or maybe... I have a revolutionary idea: just express to the compiler what you mean once, not using two different means that (by mistake) may say conflicting things. Let's see... maybe just using indentation? This seems a revolutionary idea, surely no one has put it into practice... oh, Knuth expressed the same idea more than 30 years ago... how cute.
>
> Bye,
> bearophile

At the risk of reliving an old discussion...

http://dobbscodetalk.com/index.php?option=com_myblog&show=Redundancy-in-Programming-Languages.html&Itemid=29

In the case of Python (I assume that's the same sort of behavior as the Knuth you mention), the whole point behind the way it handles scope/indentation was to correct the problem of source files with improper indentation by actually enforcing proper indentation. That's a very worthy goal. But the problem is in the way it goes about it: Python doesn't enforce a damn thing with regard to indentation. It *can't* enforce proper indentation, because it runs around assuming that the indentation it receives *is* the intended scope. It can't enforce it simply because it doesn't have the slightest idea what the proper indentation for a particular piece of code would be - that would require separating scope from indentation and... oh, yeah, that's what C-based languages like D do.
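
A quick runnable sketch of that separation, adapting Markus's snippet from upthread: the grammar and the whitespace are two independent signals, so they can disagree, and the disagreement is exactly what a checker could detect - which Python, by design, cannot:

    import std.stdio;

    void doThat()     { writefln("that"); }
    void doThatAlso() { writefln("that also"); }

    void main()
    {
        int a = 1, b = 2;
        if (a == b)
            doThat();
            doThatAlso();   // the grammar says "outside the if", the indentation
                            // says "inside" - two signals, one detectable conflict
    }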


July 10, 2008
"Markus Koskimies" <markus@reaaliaika.net> wrote in message news:g549hh$1h9i$2@digitalmars.com...
> On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:
>
>> I *also* want a tool (or sheaf of tools, smart editor, etc.) that will do lint-like static analysis and style vetting to warn me that, yes, this is legal D but you're using it in an obscure or unmaintainable or not easily extensible or not easily understood manner. _But_I_don't_want_that_tool_to_be_the_compiler_!
>
> Oh, I would like to see that as part of the compiler. In fact, the more warnings the compiler generates, the more I like it.

Right. See, even if you don't want that tool to be your compiler... you don't have to turn that feature on. If I want to use a TV remote, I can do so without dealing with the buttons that are built into the TV.

>> Walter is right that you end up with effectively 2**n different languages depending, not only on which warnings you enable|disable, but also on whether the shop you work for demands that you compile at /W1 or /W3 or /W4 and does or doesn't treat warnings as errors.
>
> Ah, there need only be one warning level - enable all, and regard warnings as errors. Who wants to disable warnings? Who wants to see only some of the warnings? IMO it's just fine to put all of them on the screen and not compile until the programmer has corrected them :)
>

I'm not sure I see the need for as many as four warning levels (though I suppose I could be convinced given an appropriate argument), but something like this sounds ideal to me:

- enable typically-useful warnings
- enable anally-retentive, only sometimes-helpful, warnings

- treat typically-useful warnings as errors
- treat all warnings as errors

>> I applaud Walter for not making that error. And I want him focused on writing a knife-clean compiler that stabs illegal code in the heart, and trusts the programmer to have meant what he said when the code is legal, even if it's "excessively clever".
>
Heh, I like compilers that do not over-estimate the cleverness of the developer, but instead think that they (the compilers) are the smarter party ;) Even though I know the syntax and best practices of a language well, many times I write something other than what I thought I wrote. For catching these kinds of spurious "miswritings" there is "syntactic salt" in many languages, including D. But at some point I think it's no use adding more of this salt; instead, do static checks to make the language better.
>

At the risk of a "me too" post...Me too ;)


July 10, 2008
"Markus Koskimies" <markus@reaaliaika.net> wrote in message news:g54b6m$1h9i$4@digitalmars.com...
> On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:
>
>> In a "properly defined language", how would you solve the problem of unintentionally-unused variables?
>
> My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect.
>
> For function arguments that are unused but mandatory for keeping an interface, just leave the parameter unnamed.
>
> Furthermore, give errors for unused private/static things too. If they are not used, why are they in the code? Just comment them out.
>
> In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a > 0) { ... }"), code that has no effect, and all those things :)
>

I'd prefer a warning, but I'd be fine with all this.
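
For concreteness, the constant-condition case quoted above as a runnable sketch: a is default-initialized to 0 in D and never reassigned, so the test is provably always false.

    import std.stdio;

    void main()
    {
        uint a;            // 0 in D, and never written again
        if (a > 0)         // provably always false: prime warning material
            writefln("unreachable");
    }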

> And yes, warnings could be considered "optional errors" by those of us who think it's best to tackle all sorts of quirks & potential bugs at compile time rather than trying to find them with runtime debugging. As long as the warning makes some sense and can be circumvented in some reasonable way, just throw it on my screen :)

I, too, like to tackle all that stuff right when I compile. But whenever I've referred to warnings as "optional errors" here, what I meant was a setup where it's impossible to turn off "treat warnings as errors". Even if warnings are not treated as errors, they're still going to show up on your screen (provided you enabled them, of course), so you can still choose to deal with them right then and there.

The benefit is that you wouldn't have to fix (or wait for a fix for) any warnings in the third-party source libraries you use. Also, while I can't confirm or deny this at the moment, someone here said that -w compiles currently halt at the first warning. If that's the case, then disabling "treat warnings as errors" would let you see all the warnings at once, not just the first one. Plus, allowing "treat warnings as errors" to be disabled would weaken the phenomenon Walter and others described, where warnings effectively create multiple versions of the same language. The phenomenon would then only occur in places that take "No warnings allowed!" to an obsessive/compulsive/irrational level (rather than a merely sensible one), instead of happening to everybody.


July 10, 2008
On Thu, 10 Jul 2008 11:20:21 +0100, Don <nospam@nospam.com.au> wrote:

> Bruce Adams wrote:

>>  I would contend this is a problem with the quality of headers provided by M$.
>> Library code has a greater need to be high quality than regular code.
>> Operating system APIs even more so.
>> Removing warnings from C/C++ headers requires you to write them carefully to
>> remove the ambiguity that leads to the warning. That is, this definition of
>> quality is a measure that increases with decreasing semantic ambiguity.
>
> I think it's a complete fallacy to think that a lower number of warnings is proportional to better code quality.
> Once a warning is sufficiently spurious (say, a less than 1% chance of indicating a real error), it's more likely that you'll introduce an error in getting rid of the warning.
> In C++, error-free code is clearly defined in the spec. But warning-free code is not in the spec. You're at the mercy of any compiler writer who decides to put in some poorly thought out, idiotic warning.
>
I didn't say that *overall* quality is related to fewer warnings, but it is a factor. There are other factors that are typically more significant. Still, given the choice between code with some warnings and warning-free code, all other things being equal, I would pick the warning-free code. You obviously shift your quality measure towards that aspect of readability; personally, I think the impact on readability is minimal.

> If you insist on avoiding all warnings, you're effectively using a programming language spec that one individual has carelessly made up on a whim.
>
While some warnings are less useful than others, I don't think it's fair in general to say they're introduced carelessly on a whim.

> For example, VC6 generates some utterly ridiculous warnings. In some cases, the chance of it being a bug is not small, it is ZERO.
>
Before they got Herb Sutter on board, VC++ was notoriously bad. If the chance really is zero, then the warning is a compiler bug; and if you know it to be spurious, you can disable it with a pragma. Similarly, in gcc every warning is supposed to have an on/off switch, so you get to choose which warnings you think are important. I am well aware that some people choose to ignore all warnings in order to code faster. In general that's a false economy, like not writing unit tests.


> In DMD, the signed/unsigned mismatch warning is almost always spurious. Getting rid of it reduces code quality.

I have encountered quite a few bugs (in C++) relating to unsigned/signed mismatches. It's a very subtle and hard-to-spot problem when a simple addition suddenly changes the sign of your result. It costs an ugly cast to remove the warning, but that's a trade I'm prepared to make to never have to worry about such bugs.
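
The kind of bug I mean, as a small sketch (made-up values):

    import std.stdio;

    void main()
    {
        uint balance = 10;
        int  delta   = -20;
        // the addition is done in uint: -20 wraps around, so the sum is
        // 4294967286 rather than -10, and the comparison is unsigned too
        if (balance + delta > 0)
            writefln("still positive?");   // prints, surprisingly
    }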

Regards,

Bruce.