July 10, 2008
Nick Sabalausky Wrote:

> "superdan" <super@dan.org> wrote in message news:g53831$20jk$1@digitalmars.com...
> > Steven Schveighoffer Wrote:
> >
> >> "Walter Bright" wrote
> >> > The reason for treating warnings as errors when warnings are enabled is
> >> > so
> >> > that, for a long build, they don't scroll up and off your screen and go
> >> > unnoticed.
> >>
> >> I've been following this thread, and I'm not really sure which side of
> >> the
> >> issue I'm on, but this, sir, is one of the worst explanations for a
> >> feature.
> >> Ever heard of 'less'?  or 'more' on Windows?  Maybe piping to a file?
> >> Maybe
> >> using an IDE that stores all the warnings/errors for you?
> >>
> >> Please stop saving poor Mr. ignorant programmer from himself.  Education
> >> is
> >> the key to solving this problem, not catering to the ignorance.
> >>
> >> Sorry for the harshness, but seriously!
> >
> > in c++ this kind of argument that contains "it's an issue of education and shit" in it has been used for many years. after a lot of experience in the field nowadays everyone silently agrees that that argument is useless. folks on comp.lang.c++ start mocking you if u bring that argument up.
> >
> 
> That's probably because over the past ten years, the people who care more about doing things the right way than catering to the status quo have been leaving C++ en masse (hence, D). It's no surprise that the people still remaining onboard C++ are either A. people who hold that particular viewpoint or B. people who are required to use C++ for some reason and have long since gotten used to the fact that C++ is never going to fix most of its problems. So I wouldn't place too much weight on the "comp.lang.c++" take on this particular issue; their consensus is likely just a reflection of group dynamics.

group was given as an example. the thing is it has become clear to the luminaries that invoking better education is not an answer. it is clear from the literature and also from the c++0x work.

> > i am 110% on walter's side on this shit. there should be no warnings and shit. only errors. it is not catering to the ignorant. it is a matter of a properly defined language.
> >
> 
> That's right, no true warnings, but just a handful of what are in effect "optional errors".
> 
> In a "properly defined language", how would you solve the problem of unintentionally-unused variables?

first i'd stop bitching why oh why the language does not build that shit in. that would be a great start. give me my fucking soapbox again. there. thanks. too many people around here are trigger happy about changing the language. (next breath they yell they want stability.) has nothing to do with you but reminds me of shit goin' on here in this group.

moron: "d has no bitfields. somehow in my fucking world bitfields are so essential, i can't fucking live without them. hence i can't use d. give me bitfields and i'll give you my girlfriend."

months go by.

walter: "here, there are perfectly functional bitfields in std.bitmanip. they're more flexible and more rigorously defined than in fucking c. you can count on'em."

moron: "don't like the syntax. still won't use d. i want them in the language. put them in the language and i'll use d."

> Adopt the "unused" keyword that Koroskin Denis proposed and say that an unused var without the unused keyword is an error, and accessing a var that does have the unused keyword is also an error?

once i stop bitching i get a clearer mind and I get to write some shit like this.

// templated no-op whose only job is to "use" a variable, silencing a
// would-be unused-variable error
void vacuouslyUse(T)(ref T x) {}

void foo()
{
    int crap;
    vacuouslyUse(crap);
    // ...
}

use and remove as you wish.

> That sounded good to me at first but then I realized: What happens when you're in the middle of an implementation and you stick the "unused" keyword on a variable in a function that you've only partially implemented just because you want to test the partial implementation. Then you fix any problems, get distracted by something else, and forget to finish (it happens more than you may think). Well great, now that wonderful compiles/errors dichotomy has just *created* a hole for that bug to slip in, whereas a real warning (the true kind, not the "warnings as errors" kind) would have caught it.

unused name should be an error. if you want to not use something, you must sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.

> So how else could a "properly defined language" solve it? Just simply treat it as a non-error as it is now and be done with it? That turns potentially-noisy errors into silent errors which is one of the biggest design mistakes of all.
> 
> Any other suggestions on how to "properly design" a fix for that? If it works, I'd be all for it.

it works but i kinda doubt you'll be all for it. you don't want to solve the unused variable problem. you want compiler warnings. somehow you'll work your argument out to make my solution undesirable.

> Suppose that does get fixed. Now, when some other common gotcha is discovered in a language, or a particular version of a language, that's had a design freeze (like D1), then what do you do? Stick to your "warnings are bad" guns and just leave everyone tripping over the gotcha in the dark, maybe hoping that someone else could come along and create a lint tool that would do the job that you could have already done?

this is an imperfect world. i see value in the no-warnings stance. you don't. therefore when competition in the d compiler arena picks up, i'd see a warning as a shitty concession, while you will grin "i told ya all along".

> Designing everything to fit into a compiles/errors dichotomy is great, in theory. But in practice it's just unrealistic. Even Walter ended up having to add a few "warnings" to D (even if he implemented them more as optional errors than as true warnings). Which is why, as I was saying in the beginning, trying to eliminate the need for a specific warning is great - *if* it actually pans out. But that doesn't always happen.

when doesn't it happen?

> > a lint tool should not be folded into d. such a tool could e.g. follow pointers, do virtual execution, and some other weird shit. it could run for hours and produce output that takes an expert to interpret. that kind of shit does not belong in the compiler.
> 
> Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.

weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?
July 10, 2008
superdan wrote:
> walter: "here, there are perfectly functional bitfields in
> std.bitmanip. they're more flexible and more rigorously defined than
> in fucking c. you can count on'em."

I'd like to take credit for std.bitmanip, but it's Andrei's design and effort.
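
(For readers who haven't looked at it yet, here is a minimal sketch of what using std.bitmanip's bitfields looks like; the struct and field names are invented for illustration, and the declared widths have to add up to 8, 16, 32 or 64 bits.)

import std.bitmanip;

struct Flags
{
    // 2 + 5 + 1 = 8 bits, packed into a single ubyte
    mixin(bitfields!(
        uint, "mode",   2,
        uint, "count",  5,
        bool, "active", 1));
}

void example()
{
    Flags f;
    f.mode   = 3;      // each field gets a generated getter/setter pair
    f.active = true;
}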
July 10, 2008
Nick Sabalausky wrote:
> But I worry that eventually we'll get to some point where all code either is or can be generated straight from "intents" syntax. Now that certainly sounds great, but at that point all we really would have done is reinvent "programming language" and we'd be left with the same problem we have today: how can we be sure that the "code"/"intents" that we wrote are really what we intended to write? The solution would have just recreated the problem.

Back in the 80's, there was a heavily advertised product that touted "no more programming necessary". All you had to do was write in their "scripting language" and the product would read that and do all the "programming" for you.

I thought it was hilarious.
July 10, 2008
"superdan" <super@dan.org> wrote in message news:g53ms5$h6n$1@digitalmars.com...
>
> group was given as an example. the thing is it has become clear to the luminaries that invoking better education is not an answer. it is clear from the literature and also from the c++0x work.
>
>
[rambling, barely-readable cuss-fest trimmed]
>
> once i stop bitching i get a clearer mind and I get to write some shit like this.
>
> void vacuouslyUse(T)(ref T x) {}
>
> void foo()
> {
>    int crap;
>    vacuouslyUse(crap);
>    // ...
> }
>
> use and remove as you wish.
>
> unused name should be an error. if you want to not use something, you must sweat a little. vacuouslyUse fits the bill exactly. should be in phobos.

I would still prefer it to be a warning (that way it would keep nagging me when I forget to finish up and take out the temporary vacuouslyUse), but at this point I could live with this compromise. It would certainly be a lot better than the total silence it gives me now.

>
> it works but i kinda doubt you'll be all for it. you don't want to solve the unused variable problem. you want compiler warnings. somehow you'll work your argument out to make my solution undesirable.
>
>
> this is an imperfect world. i see value in the no warning stance. you don't see.

I see value in warnings, you don't. The imperfect-world fact only serves to illustrate that taking sound practical advice ("the need for warnings should be minimized") to a unilateral extreme ("all warnings are always bad") just doesn't typically work out. Remember when the Java folks were trying to tell us that nothing should ever be non-OO?

> therefore when competition in d compilers arena will pick up i'd see a warning as a shitty concession, while you will grin "i told ya all along".
>

I'm well aware of the difference between truth and popular opinion.

>> Designing everything to fit into a compiles/errors dichotomy is great, in
>> theory. But in practice it's just unrealistic. Even Walter ended up
>> having
>> to add a few "warnings" to D (even if he implemented them more as
>> optional
>> errors than as true warnings). Which is why, as I was saying in the
>> beginning, trying to eliminate the need for a specific warning is great -
>> *if* it actually pans out. But that doesn't always happen.
>
> when doesn't it happen?
>

As just a few examples: http://www.digitalmars.com/d/1.0/warnings.html

>>
>> Anything like that can be attached to an optional command-line parameter that defaults to "off". Problem solved.
>
> weak argument. a good program does some shit and does it well. i'm pissed that emacs can browse the web already, alright?

Trying to convince a Unix-hater of something by appealing to Unix values is kinda like using the Bible to convince an atheist of something. But I'm well aware that debating the merits of the Unix philosophy to a Unix fan is equally fruitless, so I'm going to leave this particular point at that.


July 10, 2008
Walter Bright wrote:
> Nick Sabalausky wrote:
>> I don't suppose there's any chance of DMD getting a warning for variables/arguments that are declared but never accessed? Just today alone there's been two bugs I spent 10-30 minutes going nuts trying to track down that turned out to be variables I had intended to use but forgot to. 
> 
> So, why not just turn off the warnings?
> 
> The problem with warnings is that if there are n warnings, there are essentially n factorial different versions of the language. <Snip>

> There is a place for warnings, however. That is in a separate static analysis tool (i.e. lint, coverity, etc.) which can be armed with all kinds of heuristics with which to flag questionable constructs. I don't think they should be part of the compiler, however.

Something like lint can be run with consistent output on every D compiler and platform, since it doesn't care about generating the actual platform-specific code. Having no warnings in the compiler means you can't have hardware/platform-specific warnings, and I think that's great.

Personally, in C++ I like to turn on warning level 4 with warnings-as-errors and run both a GCC and an MSVC++ compile (when working on multiple platforms). Most warnings can be removed without resorting to pragmas, and using both compilers gives much better coverage of warnings.

I'm personally a fan of catching things early. The more warnings-as-errors the better. If I have to suffer a little for false positives, *shrug*, it's still much better than spending hours in a mud pit full of crocodiles, that is, debugging.

-Joel
July 10, 2008
On Wed, 09 Jul 2008 15:13:15 -0700, Davidson Corry wrote:

> I agree with Walter. One of the driving forces behind D was a desire *not* to have the quirks, corners and obscurities that grew within C++ over the years because Stroustrup wanted full backwards compatibility with C, etc.

With this part I agree. D is a great language, and it has been my "home language" for years (it replaced C++).

> I want a compiler that says *this* is legal D, *that* is not, and there's an end on it.

Maybe unused local vars, arguments or static arrays could simply be defined not to be legal D? :)

> I *also* want a tool (or sheaf of tools, smart editor, etc.) that will do lint-like static analysis and style vetting to warn me that, yes, this is legal D but you're using it in an obscure or unmaintainable or not easily extensible or not easily understood manner. _But_I_don't_want_that_tool_to_be_the_compiler_!

Oh, I would like to see that as part of the compiler. In fact, the more warnings the compiler generates, the more I like it. For me, it could even warn about indentation quirks, like:

	...
	if(a == b)
		do_that();
		do_that_also();
	...

...In which case the compiler could stop and say: either add {}'s or correct the indentation :)

> Walter is right that you end up with effectively 2**n different languages depending, not only on which warnings you enable|disable, but also on whether the shop you work for demands that you compile at /W1 or /W3 or /W4 and does or doesn't treat warnings as errors.

Ah, there need only be one warning level: enable all, and regard warnings as errors. Who wants to disable warnings? Who wants to see only part of the warnings? There's just no use for that; IMO it's perfectly OK to put all of them on screen and not compile until the programmer has corrected them :)

> I applaud Walter for not making that error. And I want him focused on writing a knife-clean compiler that stabs illegal code in the heart, and trusts the programmer to have meant what he said when the code is legal, even if it's "excessively clever".

Heh, I like compilers that do not over-estimate the cleverness of the developer, but instead assume that they (the compilers) are the smarter party ;) Even though I know the syntax and best practices of a language well, many times I write something other than what I thought I wrote. For catching these kinds of spurious "miswritings" there is "syntactic salt" in many languages, including D. But at some point I think there is no use adding more of this salt; instead, do static checks to make the language better.

As a very simple example, the current DMD warns about this:

---
import std.stdio;      // writefln
import std.c.stdlib;   // exit

void error(string msg)
{
    writefln(msg);
    exit(-1);
}

int doSomethingWith(string a)
{
    if(a == null)
    {
        error("A null string");
    }
    else
    {
        return a.length;
    }
}
---
$ dmd -w test.d
warning - xxx: function test.doSomethingWith no return at end of function

...Since it does not understand that exit() never returns (yes, I know that case should be written in a different manner, but it is just an example). It could be told e.g. with some new return type (instead of "void exit" you would write "never_return exit"), and of course the analysis should go through the possible execution flows to check which parts of the code may return and which cannot. Similar cases occur with switch statements.
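
(For what it's worth, a common way to satisfy DMD's flow analysis in a case like this is to end the function with assert(0);, which the compiler treats as a point that is never reached. A minimal sketch of the example rewritten that way:)

int doSomethingWith(string a)
{
    if(a == null)
    {
        error("A null string");
    }
    else
    {
        return a.length;
    }
    assert(0);  // never reached at runtime; silences the "no return" warning
}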

What I'm trying to say is that IMO it is impossible to treat the language, the compiler (code generation) and static checking as three separate things. If there is good synergy between these three elements, the language is great. But that's just my opinion...
July 10, 2008
On Wed, 09 Jul 2008 13:41:35 -0700, Walter Bright wrote:

> Nick Sabalausky wrote:
>> "Walter Bright" <newshound1@digitalmars.com> wrote in message news:g530j8$18th$1@digitalmars.com...
>>> The reason for treating warnings as errors when warnings are enabled is so that, for a long build, they don't scroll up and off your screen and go unnoticed.
>> 
>> Pardon me for saying so, but that doesn't sound like a very convincing reason to turn every warning (which, by its very nature, is something that might not be a bug) into something that splits the language into what are effectively different languages.
> 
> If you turn warnings on, then you want to see them and presumably deal with them. If you don't deal with them, then they persist every time you compile, and either they get very irritating and you fix them anyway, or you develop a blind spot for them and never see the ones you do want to fix.

I completely agree with this. If warnings are generated, it's best to stop compilation and let the developer correct those parts.

Warnings that do not stop the build process have no use at all.
July 10, 2008
Nick Sabalausky wrote:

> Can you prove
[...]
> Do you have a general-case proof

No. And in addition, no one is obliged to provide a counter-proof for any claim, especially not if the claim is not formalized.

I have only a counter-hint: D, as an intended systems programming language, has `cast'- and `asm'-statements as well as pointers available. With these, a clever coder might be able to access every data storage location accessible to the program, regardless of the protection status announced by the source.
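
(For illustration, a minimal sketch of what such a coder might write; the names are arbitrary:)

void poke()
{
    int secret = 42;
    // reinterpret the int's storage as raw bytes and overwrite it,
    // regardless of what the declared types appear to promise
    ubyte* p = cast(ubyte*) &secret;
    p[0] = 0;
}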

-manfred

-- 
Maybe some knowledge of some types of disagreeing and their relation can turn out to be useful: http://blog.createdebate.com/2008/04/07/writing-strong-arguments/
July 10, 2008
On Wed, 09 Jul 2008 17:53:52 -0400, Nick Sabalausky wrote:

> In a "properly defined language", how would you solve the problem of unintentionally-unused variables?

My suggestion: just give an error. No need for an "unused" keyword; just comment out code that has no effect.

For function arguments that are unused but mandatory because of keeping an interface, just leave the parameter unnamed.
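
For example (a minimal sketch; the interface and method names are made up):

interface Handler
{
    void onEvent(int code, string details);
}

class SimpleHandler : Handler
{
    // the second parameter is required by the interface but not used here,
    // so it is simply left without a name
    void onEvent(int code, string)
    {
        // ... handle 'code' only ...
    }
}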

Furthermore, also give errors for unused private/static things. If they are not used, why are they in the code? Just comment them out.

In a similar manner, warn about conditional expressions that have a constant value (like "uint a; if(a >= 0) { ... }", which is always true), code that has no effect, and all those things :)

And yes, warnings could be considered "optional errors" by those of us who think it's best to tackle all sorts of quirks & potential bugs at compile time rather than trying to find them with runtime debugging. As long as the warning makes some sense and can be circumvented in some reasonable way, just throw it onto my screen :)
July 10, 2008
On Thu, 10 Jul 2008 06:17:22 +0000, Markus Koskimies wrote:

Well, I need to share this experience with you; I have been debugging one of my segfaulting D programs for a few hours, and I finally found the reason. A shortened version:

	foreach(...)
	{
		Arg arg = null;

		...
		if( ... )
		...
		else if( ... )
		{
			...
			arg = somefunc( ... );
		}
		else if( ... )
		...
		else if( ... )
		{
--->			someotherfunc( ... ); <---
		}
		...
	...
	}

Those functions return Arg class objects, but earlier they returned void (and took Arg objects as parameters). When modifying the code I forgot to store the return value in one of the execution paths --> segfaults.

Having "optional error" called a warning about not using return value of a function would have saved a lots of time, cigarettes, coffee and headache :D