December 08, 2007
Janice Caron wrote:
> On 12/7/07, Leandro Lucarella <llucax@gmail.com> wrote:
>> That's the worst reason ever! There are so many things we already use that
>> suck...
> 
> Yep, I agree with everyone. Especially with the above comment. I will
> certainly /stop/ writing enum { x=3 } if a more intuitive way comes
> along.

Why isn't:
	enum x = 3;
more intuitive?

> So there you have it, Walter: Unanimous support on this newsgroup
> (...is that unprecedented?...) for NOT using enum as a storage class
> to define compile-time constants.
> 
> We all seem to be cool with the concept, just not with the word
> "enum". My fave is "final", but I'd be happy with any of the other
> alternatives that have been suggested so far.

Let's look at final for a moment. Final is currently a storage class for member functions, as in:
	final
	{
		int foo();
		int bar();
	}
but yet:
	final x = y;
is proposed. This doesn't work too well in the syntax, as we don't have:
	typedef
	{
		int myint;
	}
either. Would we really want:
	final
	{
		x = y;
	}
? I don't think that looks right. alias also has strange syntactic problems, already discussed, like does:
	alias int x = 3;
make any intuitive sense? Why does:
	final int x = 3;
make any more intuitive sense than:
	enum int x = 3;
? And lastly, since anonymous enumerated constants are already just what we need, and the proposed new enum variation is just a syntactic shorthand for an anonymous enum with one member, what is the intuitive argument for when one should use a final and when one should use an enum?
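
For concreteness, the equivalence described here can be sketched in D (the identifiers are invented for illustration):

```d
// Existing spelling: an anonymous enum with a single member.
enum { answer = 42 }

// The proposed shorthand: the same kind of manifest constant,
// one per declaration. No storage is allocated; the name simply
// stands for the literal.
enum limit = 64;

// Both behave like literals at compile time:
static assert(answer == 42);
int[limit] buffer;   // usable wherever a compile-time constant is required
```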
December 08, 2007
Walter Bright wrote:
> Janice Caron wrote:
>> On 12/7/07, Leandro Lucarella <llucax@gmail.com> wrote:
>>> That's the worst reason ever! There are so many things we already use that
>>> suck...
>>
>> Yep, I agree with everyone. Especially with the above comment. I will
>> certainly /stop/ writing enum { x=3 } if a more intuitive way comes
>> along.
> 
> Why isn't:
>     enum x = 3;
> more intuitive?

Because you're *not* *enumerating* anything.

From wikipedia:
"""
In computer programming, an enumerated type is an abstract data type used to model an attribute that has a specific number of options (or identifiers) such as the suit of a playing card (i.e. a Club, Diamond, Heart or Spade). Using this type allows the program to handle the attribute more efficiently than a string while maintaining the readability of the source code.
"""


> Let's look at final for a moment. Final is currently a storage class for member functions, as in:
>     final
>     {
>         int foo();
>         int bar();
>     }
> but yet:
>     final x = y;
> is proposed. This doesn't work too well in the syntax, as we don't have:
>     typedef
>     {
>         int myint;
>     }
> either. Would we really want:
>     final
>     {
>         x = y;
>     }

Actually I kind of like that block typedef!  Often templated classes start off with a preamble of aliases.  It would be nice to put them all in a block.  (But it would be easier to read if it could be done with the x = y style, since "int myint" looks like it's declaring an integer variable).

> ? I don't think that looks right. alias also has strange syntactic problems, already discussed, like does:
>     alias int x = 3;
> make any intuitive sense? 

Sure! It means

  alias x = cast(int)3;

It's an alias for the literal 3, which you're telling the compiler to treat as an int. But since it acts like a variable, you can declare it using a variant of the standard variable syntax.

> Why does:
>     final int x = 3;
> make any more intuitive sense than:
>     enum int x = 3;
> ? 

There are these things called "words".  And they have "meanings"...
enum: (short for "enumeration", the noun form of "enumerate")
   "to specify one after another : list"
final:
   "not to be altered or undone <all sales are final>"

(definitions courtesy m-w.com)

> And lastly, since anonymous enumerated constants are already just what we need, 

Enumerated constants are *not* what we need.  We need manifest constants.  We're not enumerating anything!  We're just trying to declare a shorthand name for a constant value.

> and the proposed new enum variation is just a syntactic shorthand for an anonymous enum with one member, 

That's the only thing using 'enum' has going for it.

> what is the intuitive argument for when one should use a final and when one should use an enum?

The description in wikipedia is decent:
http://en.wikipedia.org/wiki/Enumerated_type

C's enumerated types already bastardized the concept a bit by allowing specific values to be assigned.  D goes further in some ways by allowing you to specify a type for the enum as well.  And this new proposal is like the nail through the heart of any vestigial meaning remaining in the word 'enum', either mathematical or layman's.
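
Both steps of that drift are visible in D's own syntax (a sketch; the names are invented for illustration):

```d
// C's bastardization: members with hand-assigned, non-sequential values.
enum Flag { None = 0, Read = 1, Write = 2, Exec = 4 }

// D's further step: the programmer also picks the base type of the enum.
enum Mask : ubyte { Low = 0x0F, High = 0xF0 }

static assert(Flag.Exec == 4);          // specific values, not mere enumeration
static assert(Mask.High.sizeof == 1);   // members really are ubyte-sized
```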

--bb
December 08, 2007
Bill Baxter wrote:
> Walter Bright wrote:

>> and the proposed new enum variation is just a syntactic shorthand for an anonymous enum with one member, 

The main reason for enum to exist, and to be called "enum" in the first place, is its behavior of automatically assigning numeric values to a list of symbols.  The original purpose of enums in C was to provide symbolic sets.  Like:

   enum { DIAMOND, HEART, CLUB, SPADE };

It doesn't really matter what the values are.  And in fact in some languages, like OCaml, I think you can't even ask what the value of DIAMOND is.  It's just a symbol that compares equal with DIAMOND and unequal with HEART, CLUB, and SPADE.  And I seem to recall they are rather proud of the fact that they have "real" enumerated types.

Rather than further watering down the meaning of "enum" I think it would make more sense to scale it back to being a simple automatically numbered list of symbols.  And let alias take all the other uses.

// manifest constant (literal/constant alias)
alias int Foo = 10;

// Group of manifest constants (literal/constant alias)
alias {
   int Foo = 10;
   int Bar = 20;
}

// Alias for type
alias float_t = float;

// Alias for type (nod to C syntax for people converting code)
alias float float_t;

// Group of type aliases
alias {
    float_t = float;
    int_t = int;
    vector_t = vec!(int);
}

// module aliases
alias math = std.math;

// module aliases (legacy syntax. keep or phase out slowly)
alias std.math math;

// Mix n match!
alias {
    float_t = float;
    float PI = 3.14159;
    math = std.math;
}

They all have the common meaning of "let X be a new name for Y".  Actually it becomes a lot like a safe version of #define.

And then you could leave enums for the things where you really do want the automatic numbering behavior that it's named after.

enum Suit {
   HEART,DIAMOND,SPADE,CLUB
}

--bb
December 08, 2007
On 12/8/07, Walter Bright <newshound1@digitalmars.com> wrote:
> Why isn't:
>         enum x = 3;
> more intuitive?

I didn't say it wasn't intuitive. /Anything/ is intuitive, once you get used to it, if the syntax is simple enough, and I can't argue that this isn't simple.

I said it was the wrong word. "enum" is short for "enumeration". It should be used for the purpose of enumeration, and nothing more. Honestly, I wish D had /real/ enumerations, so you could do

    enum Col { red, orange, yellow, green, blue, indigo, violet };

to indicate that Col was a type defining an ordered sequence of symbols. It has a /first/ element (red), a last element (violet), a successor/predecessor relationship (the successor of yellow is green), and comparison functions whose meaning is "left of" and "right of". Elements would have a toString() function, so that orange.toString == "orange". But these beasts would NOT have any numerical properties at all. Converting to or from int is just not possible. You know - /proper/ enums! (Of course, that's not going to happen, so this is not a feature request. The current implementation lets me /pretend/ that enums are proper enums, and that's probably good enough).
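
As it happens, D's current enums do approximate parts of that wishlist, while leaking exactly the numeric property a "proper" enum would forbid (a sketch):

```d
enum Col { red, orange, yellow, green, blue, indigo, violet }

static assert(Col.min == Col.red);      // a first element
static assert(Col.max == Col.violet);   // a last element
static assert(Col.yellow < Col.green);  // ordering: "left of"

// But the numeric nature leaks out: a Col converts to int implicitly,
// which is precisely what a "proper" enum would disallow.
static assert(Col.orange == 1);
```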

We all love the idea of compile-time constants. They're a /great/ idea. Just, please use the right word to express them. If you don't like "final", not a problem. Others have suggested "alias", "macro", "define" and "let", and any one of those would make me happy.
December 08, 2007
On 12/8/07, Bill Baxter <dnewsgroup@billbaxter.com> wrote:
> There are these things called "words".  And they have "meanings"...

Yes, that's /exactly/ the point I was making. Well said, Bill.