December 05, 2003
The following should protect (at least) newbies, and my
future D students:

...
case 0:
..
break;case 1:
..
break;case 2:
..
break;
case 50:
case 51:
case 52:
..
break;case 79:
..
break;case 80:
..


December 06, 2003
I like the cut of your jib

"Georg Wrede" <Georg_member@pathlink.com> wrote in message news:bqqpek$2cbs$1@digitaldaemon.com...
> The D if-statement demands a {} instead of a semicolon.
> The switch statement could do the same with the default
> clause. Leave it out, and you get a compiler error.
>
> If you want to not do anything, then just write
> default
> {}
> and everyone should be happy.
>
> "It should not be impossible to exercise bad habits,
> it just shouldn't be too easy."
>
> PS, why doesn't the if statement issue a runtime
> exception at a missing {} ? Shouldn't the compiler
> behave the same in similar situations?   :-)
>
>


December 06, 2003
Yuck, I hope I don't run up against any of your students, that's just uuuugly!

In article <bqqt9t$2i0c$1@digitaldaemon.com>, Georg Wrede says...
>
>The following should protect (at least newbies), and my
>future D students:
>
>...
>case 0:
>..
>break;case 1:
>..
>break;case 2:
>..
>break;
>case 50:
>case 51:
>case 52:
>..
>break;case 79:
>..
>break;case 80:
>..
>
>


December 06, 2003
"Ilya Minkov" <Ilya_member@pathlink.com> wrote in message news:bqhnh3$15iv$1@digitaldaemon.com...
> I was advocating something different: it makes sense to do this as part of the semantic analysis process, and bug the user with arbitrary conditions like "i'm not convinced this function returns" or "this switch may not cover a variable value range", as long as convincing the compiler usually takes only one line, which would also make code more explicit. I'm only speaking from my experience that I found this practice quite convenient. (Delphi fans vote here!)

The trouble with 'may' is you wind up inserting some piece of cruft to get the compiler to pass it. That cruft is usually dead code, and since it does not contribute to the algorithm it serves merely to confuse the maintenance programmer. Two common pieces of such cruft are an extra return statement that's never executed, and an initializer for a variable that is always overwritten before ever being read. Here's the extra initializer thing:

int x;    <= compiler demands that this needs an initializer
if (...)
    x = 3;    <= actual algorithmic initialization
...
if (same condition as previous if)
    y = x;    <= x is always 3 here, but the well-meaning but nagging
              compiler squawks that it is possibly uninitialized.

Argh. Inserting the extra initializer at the top is just confusing, as the maintenance programmer thinks "where is this value ever used? Looks like some sort of bug."


> You must also consider that D advocates a different development cycle than C++. Recompile often and early. In this context it makes sense that the compiler bugs you about something not so important, and not the runtime, especially the compiler being so fast.

I agree with pushing as much off into compile time as makes sense to. But if I go too far, then we've got Pascal, and I gave up on that language after discovering that 50% of my coding time was spent fighting the compiler's best efforts to stop me from writing the kind of code I wanted to write.


> Your disagreement probably has something to do with your opposition to warnings, since that's the compiler message class this kind of diagnostic would best refer to. You really have to put up a paper on that so that we understand you better.

Perhaps I should rename "warnings" to "nags" and my rationale would be clearer <g>.

> > Even if all the enum values are accounted for, it's certainly possible the user cast some other value into the enum type.
> Should this be legal at all?

Yes. A cast exists as an escape from the language's typing rules. Pascal didn't have a cast.

> If yes, then someone should take responsibility. Either the cast has to take care that the enum only gets a legal value, by throwing an exception otherwise, or the programmer...

I expect the programmer to account for it, and if he fails to, hopefully the runtime will catch it and throw an exception.

> > Part of my reluctance to do that is implicit fallthrough just has never been an issue for me in 25 years of programming. But the implicit default:break; has bitten me many times!
> And in the 5 years before the 25 years?
> You must also consider that you are an exceptional talent and a full-time programmer. I don't value language-specific skills that much...

I make plenty of coding errors, just like all of us here.


December 06, 2003
>int x;    <= compiler demands that this needs an initializer
>if (...)
>    x = 3;    <= actual algorithmic initialization
>...
>if (same condition as previous if)
>    y = x;    <= x is always 3 here, but the well-meaning but nagging
>              compiler squawks that it is possibly uninitialized.
>
and then someone else comes along and changes the code to
int x;
if ( cond ) { x=3; }
...
if ( cond || other cond ) { y = x; }

and without warning we have a bug that the compiler is not telling us about.

one solution would be
int x = int.init;
to keep the compiler happy (just explicitly doing what should be done anyway).
A better one would be
int x = undef;
in which case the debug build should change it to:
int x = int.init;
bool __x_set = false;
if ( cond ) { x = 3; __x_set = true; }
....
if ( cond || other ) { // or just if ( cond )
    if ( !__x_set ) { throw new Error( "x not initialised", __FILE__, __LINE__ ); }
    y = x;
}
//
In a release build, 'int x = undef' is just 'int x = int.init';


December 06, 2003
In article <bqr86c$b3$1@digitaldaemon.com>, Natsayer says...
>
>Yuck, I hope I don't run up against any of your students, that's just uuuugly!

Yeah. That was the whole point.

OTOH, during the first semester, these first-year students have so much else to grasp that finding a missing break in their own code is just unnecessary grief. The second semester, when they've gained some confidence in their coding, we'll revisit the case statement and introduce fall-through. (IF, by that time, I have become convinced that there exist real-world reasons for it, outside nifty textbook examples!)

I trust that the more talented students will find out how and when to use fall-through by themselves, but those who haven't might just be better off not using it. I'm also not going to teach goto, for the same reason.

In the limited time of a class, there are more important things to teach. After all, they're there to learn how to program, and no class should teach every detail of a language just because it exists.

I honestly believe D is an excellent first language. Maybe even better than Pascal (which actually was created for this very purpose)! D is also powerful enough to remain the main language through university. The other "obligatory languages" (Java, C, C++, Lisp, whatever the particular university considers essential) should remain class-specific.


December 06, 2003
In article <bqs1ft$16i2$1@digitaldaemon.com>, Walter says...
>"Ilya Minkov" <Ilya_member@pathlink.com> wrote in message

(an excellent discussion deleted)

If Walter succeeds in keeping the different phases of compiling separate, then that opens up (even commercial) possibilities for D. I could envision getting an Open Source lexer, a commercial optimizer, and the code generator from a third party. Sort of like the OSI layer thing.

This would of course only be possible if the layer interfaces were standardized.

Then I could choose between a "nagging" or a "strict" front
end, I could have an "errors only", "with warnings", "with
remarks", or "with informative hints" front end, depending
on whether the users are "absolute beginners" or "Ultra
Pro Gurus". (There's more to this than just a verbosity
level switch.)

Heck, a university could even write front ends for different semester classes, and _especially_ one for non-CS-professors for their own work!


For now, this implies that whatever error messages D generates this year (or doesn't) are not all that important at the end of the day. Maybe we should concentrate more on actual language issues and other things with real long-term ramifications.


December 06, 2003
"Russ Lewis" <spamhole-2001-07-16@deming-os.org> wrote in message news:bqd01e$cc0$1@digitaldaemon.com...
> Walter, I just wanted to weigh in here.  Your current design (implicit default throws an exception) is, IMHO, a Good Thing.

At least somebody likes it <g>.

> Having the programmer add
> default: assert(0);
> is even better.

That's my normal practice in C/C++. Having that in there has saved me from countless bugs.


December 07, 2003
Walter wrote:

>"Russ Lewis" <spamhole-2001-07-16@deming-os.org> wrote in message
>news:bqd01e$cc0$1@digitaldaemon.com...
>
>>Walter, I just wanted to weigh in here.  Your current design (implicit
>>default throws an exception) is, IMHO, a Good Thing.
>
>At least somebody likes it <g>.
>
>>Having the programmer add
>>default: assert(0);
>>is even better.
>
>That's my normal practice in C/C++. Having that in there has saved me from
>countless bugs.
I've been pondering this for a while, and I know that my view is going to be controversial...

Walter is absolutely RIGHT...

If your switch receives a value that is nonsensical, you should have an error thrown (one that tells you that is what happened would be nicer than assert(0)).

If you add a value to an enum and miss one of the switches out in maintenance, then it will be exposed when you run your test suite, to which you added a test for compliance with the revised enum (you do write tests for your new code, don't you <g>).

I cannot see a problem with Walter's choice of implementation that is not due to bad programming practices (even good programmers can have bad practices).

I myself am a bad programmer with good practices, aided by D's clever features...

Alix...

-- 
           Alix Pexton
Webmaster - http://www.theDjournal.com

           Alix@theDjournal.com

December 07, 2003
I think Walter's way is better than not checking at all, but a few people on this NG (including me) think it would be even better if the compiler just complained at you that not all the cases were handled.  If the compiler can tell that there are possible input values that aren't handled, it should require a default clause.  I do not believe that enums that were typecast from ints should be considered.  If there is an error converting int to enum, it should be exposed at the point of the cast, not worried about during a later switch.

switch (value)
{
    case 1:
        blah();
        break;
    case 2:
        foo();
        break;
    case 3:
        bar();
        break;
}

Now what the programmer *probably* meant was that if you get 1, 2, or 3, do those things, otherwise do nothing.  But it's not explicit in the code.  We would want that to be explicit, and mandatory.

Then if you want an assert, you put an assert, and if you do not (expecting C's rules to hold), you don't get burned by asserts thrown once your application ships, especially if the assert is thrown in a situation that would otherwise be handled perfectly gracefully (because you really *did* mean that if the case isn't matched, it should do nothing).

Anytime there is a runtime check, there is a bug waiting to happen.  Make it so you don't need the check, and there is no possibility of a bug at all. That's better.

Sean

"Alix Pexton" <Alix@thedjournal.com> wrote in message news:bqvchc$l7l$1@digitaldaemon.com...
> I've been pondering this for a while, and I know that my view is going to be controversial...
>
> Walter is absolutely RIGHT...
>
> If your switch receives a value that is nonsensical, you should have an
> error thrown (one that tells you that is what happened would be nicer
> than assert(0)).
>
> If you add a value to an enum and miss one of the switches out in maintenance, then it will be exposed when you run your test suite, to which you added a test for compliance with the revised enum (you do write tests for your new code, don't you <g>).
>
> I cannot see a problem with Walter's choice of implementation that is not due to bad programming practices (even good programmers can have bad practices).
>
> I myself am a bad programmer with good practices, aided by D's clever features...
>
> Alix...