March 25, 2010
An idea (Re: Implicit enum conversions are a stupid PITA)
Walter Bright wrote:
> Nick Sabalausky wrote:
>> To put it simply, I agree with this even on mere principle. I'm 
>> convinced that the current D behavior is a blatant violation of 
>> strong-typing and smacks way too much of C's so-called "type system".
> 
> You're certainly not the first to feel this way about implicit 
> conversions. Niklaus Wirth did the same, and designed Pascal with no 
> implicit conversions. You had to do an explicit cast each time.
> 
> Man, what a royal pain in the ass that makes coding in Pascal. 
> Straightforward coding, like converting a string of digits to an 
> integer, becomes a mess of casts. Even worse, casts are a blunt 
> instrument that *destroys* type checking (that wasn't so much of a 
> problem with Pascal with its stone age abstract types, but it would be 
> killer for D).
> 
> Implicit integral conversions are not without problems, but when I found 
> C I threw Pascal under the nearest bus and never wrote a line in it 
> again. The taste was so bad, I refused to even look at Modula II and its 
> failed successors.
> 
> D has 12 integral types. Disabling implicit integral conversions would 
> make it unbearable to use.

I think there might be some low-hanging fruit, though.
Suppose we distinguished enums containing AssignExpressions from those 
which do not.
It seems clear to me that logical operations should always be permitted 
on enums where every member of the enum has been explicitly assigned a 
value.
enum Enum1 { A = 1, B = 2, C = 4 }
 ---> A|B makes sense.

But if there are no assign expressions at all:
enum Enum2 { A, B, C }
then I think that performing arithmetic on that enum is almost certainly 
a bug.
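The distinction can be sketched in a few lines of D. Both operations compile with current compilers; the point of the proposed rule is that only the first one should.

```d
enum Enum1 { A = 1, B = 2, C = 4 }  // every member explicitly assigned: flag-style
enum Enum2 { A, B, C }              // no AssignExpressions at all: purely symbolic

void main()
{
    auto flags = Enum1.A | Enum1.B;  // combining bit flags: meaningful
    assert(flags == 3);

    auto oops = Enum2.A + Enum2.C;   // compiles today, but is almost certainly a bug
    assert(oops == 2);               // the result is a plain int, not an Enum2 member
}
```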

The case where only some of the enum members have assign expressions is 
less clear. Some cases, such as synonyms { A, B, C = B }, aren't really any 
different from the no-assign-expression case. Others are murkier, so I'll 
just ignore them all. So I propose a simple rule:

Suppose that implicit integral conversions (and arithmetic/logical 
operations) involving enums were permitted ONLY if the enum has at least 
one AssignExpression.

This would catch some bugs, without (I think) causing much pain for 
valid code. At the very least, this is something I'd want in a lint 
tool. (But note that it wouldn't fix the issue in the original post).
March 25, 2010
Re: An idea (Re: Implicit enum conversions are a stupid PITA)
Don <nospam@nospam.com> wrote:

> I think there might be some low-hanging fruit, though.
> Suppose we distinguished enums containing AssignExpressions from those  
> which do not.
> It seems clear to me that logical operations should always be permitted  
> on enums where every member of the enum has been explicitly assigned a  
> value.
> enum Enum1 { A = 1, B = 2, C = 4 }
>   ---> A|B makes sense.
>
> But if there are no assign expressions at all:
> enum Enum2 { A, B, C }
> then I think that performing arithmetic on that enum is almost certainly  
> a bug.

I wonder: what if the default base type of an enum were simply 'enum', a
type not implicitly convertible to other types? If you want implicit
conversions, specify the base type:

enum foo { A, B, C = B } // No base type, no conversions allowed.
enum bar : int { D, E, F = E } // An int in disguise. Allow conversions.

That seems to follow D's tenet that the unsafe should be more verbose.
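In today's D both declarations convert freely; under the hypothetical rule above, only the one with an explicit base type would:

```d
// Hypothetical semantics, not current D: 'foo' declares no base type, so under
// this proposal it would be opaque; 'bar' opts in to int behaviour.
enum foo { A, B, C }
enum bar : int { D, E, F }

void main()
{
    // int x = foo.A;   // would be rejected under the proposal (compiles today)
    int y = bar.D;      // fine either way: the author asked for an int in disguise
    assert(y == 0);
}
```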

-- 
Simen
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
Nick Sabalausky wrote:
> Actually, with "bitfields", I've been mostly referring to pretty much just 
> that: doing manual bit-twiddling, typically aided by manifest constants 
> and/or enums, and taking the stance that doing that could use a better (ie, 
> more abstracted and more type-safe) interface (while still keeping the same 
> under-the-hood behavior).
> 
> Maybe it's all the low-level stuff I've done, but any time I come across the 
> term "bitfield" I instinctively envision those abstract rows of labeled 
> "bit" squares (or differently-sized rectangles) that you see in spec sheets 
> for digital hardware (ie, the abstract concept of a small piece of memory 
> having bit-aligned data), rather than specifically the 
> structs-with-sub-byte-member-alignment that I keep forgetting C has. I can't 
> really comment on that latter kind as I've never really used them (can't 
> remember why not), although I can easily believe that they may be 
> insufficient for the job. Maybe that difference is where the disagreement 
> between me and Andrei arose.


It does seem we've totally misunderstood each other. Yes, I was referring to 
(and I'm sure Andrei was as well) the C bitfield language feature.
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
Nick Sabalausky wrote:
> Hmm...That's odd...I guess I'll have to dig closer to see what's going on. 
> The discussion on "D.learn" didn't seem to indicate that the above should 
> work, so I hastily ruled out the possibility of something else going on in 
> my code. It's too late for me to think straight ATM, so I'll have to check on 
> that in the morning.


This kind of misunderstanding comes up all the time; it's why having exact 
reproducible snippets of code is so important! Often the critical detail is 
something that was omitted because the person thought it was either obvious or 
irrelevant.
March 25, 2010
Re: An idea (Re: Implicit enum conversions are a stupid PITA)
Don wrote:
> This would catch some bugs, without (I think) causing much pain for 
> valid code. At the very least, this is something I'd want in a lint 
> tool. (But note that it wouldn't fix the issue in the original post).


Such rules are interesting, but I worry that they are both too clever and too 
obtuse. Having a lot of such rules may make using D very frustrating because 
you'll never know when you'll get hit by one of them.

I'd rather have simpler, more orthogonal, and easier-to-remember rules that 
may have a downside here and there than a complex web of seemingly 
arbitrary ones.
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
yigal chripun wrote:
> here's a simple version without casts:
>
> int toString(dchar[] arr) {
>     int temp = 0;
>     for (int i = 0; i < arr.length; i++) {
>         int digit = arr[i].valueOf - 30; // *
>         if (digit < 0 || digit > 9) break;
>         temp += 10^^i * digit;
>     }
>     return temp;
> }
> 
> [*] Assume that dchar has a valueOf property that returns the value.
> 
> where's that mess of casts you mention?

In Pascal, you'd have type errors all over the place. First off, you cannot do
arithmetic on characters. You have to cast them to integers (with the ORD(c)
construction).


> Pascal is hardly the only language without explicit casts.

Pascal has explicit casts. The integer to character one is CHR(i), the character 
to integer is ORD(c).


> ML is also
> properly strongly typed and is an awesome language to use.

I don't know enough about ML to comment intelligently on it.


> The fact that D has 12 integral types is a bad design. Why do we need so many
> built-in types? To me this clearly shows a need to refactor this aspect of D.

Which would you get rid of? (13, I forgot bool!)

bool
byte
ubyte
short
ushort
int
uint
long
ulong
char
wchar
dchar
enum
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
Walter Bright wrote:
> Nick Sabalausky wrote:
>> To put it simply, I agree with this even on mere principle. I'm 
>> convinced that the current D behavior is a blatant violation of 
>> strong-typing and smacks way too much of C's so-called "type system".
> 
> You're certainly not the first to feel this way about implicit 
> conversions. Niklaus Wirth did the same, and designed Pascal with no 
> implicit conversions. You had to do an explicit cast each time.
> 
> Man, what a royal pain in the ass that makes coding in Pascal. 
> Straightforward coding, like converting a string of digits to an 
> integer, becomes a mess of casts. Even worse, casts are a blunt 
> instrument that *destroys* type checking (that wasn't so much of a 
> problem with Pascal with its stone age abstract types, but it would be 
> killer for D).

It's funny that you're saying this. Casts are totally messed up in D. 
Some casts do safe operations (casts between objects and interfaces), 
some are absolutely ridiculous and only useful in low level situations 
(casting array slices), some are safe whenever the compiler feels like 
it (array casts of array literals versus array slices), and some 
fundamentally break other language features, even accidentally (immutable).

Casts are easily greppable, but you never know what a specific cast 
actually does. Think about what this means for generic templated code.

In summary, I'd say casting rules in D are the worst spawn of hell.
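A few concrete cases of that inconsistency, side by side (all of these compile):

```d
interface I {}
class C : I {}

void main()
{
    Object o = new C;
    I i = cast(I) o;           // dynamic, checked cast: yields null on failure, safe
    assert(i !is null);

    int[] a = [1, 2, 3, 4];
    auto b = cast(ubyte[]) a;  // reinterprets raw memory: 4 ints become 16 bytes
    assert(b.length == 16);

    immutable int x = 5;
    auto p = cast(int*) &x;    // silently strips immutable: undermines the type system
}
```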

I mean, that's ok, it doesn't exactly make D useless. And you can always 
introduce your own safe (exe bloating, sigh) casting template functions. 
But I still find it funny that you say this.

> Implicit integral conversions are not without problems, but when I found 
> C I threw Pascal under the nearest bus and never wrote a line in it 
> again. The taste was so bad, I refused to even look at Modula II and its 
> failed successors.

For your information, programming in Delphi (a modern Pascal dialect) was 
quite a joy. It combined the low-level programming advantages of C 
(pointers, inline assembler), was safer than C, and included a sane 
object model similar to Java/C#. Sound familiar?

(Yeah, the template fetishists must have been very unhappy with it.)

I really don't understand why you're bashing Pascal at large. You must 
have had experience only with early Pascal dialects... and then never 
looked at anything that smelled like Pascal... and this as a language 
designer??

> D has 12 integral types. Disabling implicit integral conversions would 
> make it unbearable to use.


PS: while you guys are talking about new "absolutely necessary" language 
features, people new to D are despairing of getting a working D installation 
and *trying* to use external libraries from dsource (that kind of 
struggling really isn't nice to watch), and people "old" to D are 
despairing over compiler regressions and random bugs that have global 
effects on middle-sized to large codebases (circular dependency and 
optlink bugs come to mind). The situation is improving, but too slowly, 
and problems *never* get completely eliminated.

/rant
Well, I'm out of here.
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
Walter Bright:
>Disabling implicit integral conversions would make it unbearable to use.<

Implicit conversion from signed to unsigned is too unsafe (if you don't have integer overflow checks).
C# can teach us two lessons here: it has disabled some implicit conversions and kept others.
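A minimal D illustration of the signed/unsigned trap (C#, by contrast, rejects the int-to-uint assignment without an explicit cast):

```d
void main()
{
    int i = -1;
    uint u = i;               // accepted implicitly in D today
    assert(u == 4294967295);  // the bit pattern survives, the meaning doesn't

    int[] arr;
    auto n = arr.length - 1;  // length is size_t (unsigned): wraps instead of -1
    assert(n == size_t.max);
}
```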

Bye,
bearophile
March 25, 2010
Re: An idea (Re: Implicit enum conversions are a stupid PITA)
Walter Bright:
> Such rules are interesting, but I worry that they are both too clever and too 
> obtuse. Having a lot of such rules may make using D very frustrating because 
> you'll never know when you'll get hit by one of them.
> 
> I'd rather have simpler, more orthogonal, and easier-to-remember rules that 
> may have a downside here and there than a complex web of seemingly 
> arbitrary ones.

I agree a lot, and I agree Don was wrong here.

In my posts I have already shown a better solution, that's similar to the C# one:

enum Foo {...} // normal enum: no implicit conversions, no operators
@flags enum Foo {...} // no implicit conversions; boolean operators plus "in".
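There is no @flags in D, but the idea can be approximated at library level today. A hypothetical sketch (the Flags struct and Perm enum are illustrative, not an existing API): one bit per member, '|' to combine, 'in' to test.

```d
// Hypothetical library emulation of the proposed @flags behaviour.
struct Flags(E) if (is(E == enum))
{
    uint bits;
    // Combine a member into the set; returns a new Flags value.
    Flags opBinary(string op : "|")(E e) { return Flags(bits | (1u << e)); }
    // Membership test so that 'Perm.read in f' reads naturally.
    bool opBinaryRight(string op : "in")(E e) const { return (bits & (1u << e)) != 0; }
}

enum Perm { read, write, exec }

void main()
{
    Flags!Perm f;
    f = f | Perm.read | Perm.exec;
    assert(Perm.read in f);
    assert(Perm.write !in f);
}
```

The enum members themselves stay plain symbolic values; only the wrapper knows about bits, which is roughly the separation bearophile's @flags annotation would enforce.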

Bye,
bearophile
March 25, 2010
Re: Implicit enum conversions are a stupid PITA
yigal chripun:
> The fact that D has 12 integral types is a bad design. Why do we need so many built-in types? To me this clearly shows a need to refactor this aspect of D.

In a systems language you need them all, and you need multi-precision integers too.
Note: C# too has unsigned types, but they are separated :-)

Bye,
bearophile