December 03, 2007 typedefs are useless

This may have arisen somewhere before, but...

Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long. I can do:

typedef long mytype;

However, I can't create literals of this type. So if I want to initialize a mytype value to 6, I have to do:

mytype x = cast(mytype)6L;

And if I want to scale a mytype value, I can do:

x *= 6;

but if I wanted to create a scaled value of another mytype value, this is illegal:

x = y * 6;

So in order to get this to work, I have to do:

x = y * cast(mytype)6;

which is really cumbersome, when what I want is an integral type that is semantically different from the integral type it's based on. But I want literals of the base type to implicitly convert to my type so I don't have these ugly casts.

Note that an enum is EXACTLY the same as a typedef, with the added feature that an enum can have const values of itself accessible as members. In that regard, why would anyone use a typedef when it is a less-functional version of an enum?

In short, I would like the compiler to auto-promote a literal to a typedef that was based on that literal's type. Not sure if this is feasible, but without something like this, there is no reason to have a typedef keyword. I can use enum instead.

-Steve
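[Editor's note: putting the fragments above together, here is a minimal D1-era sketch of the situation being described. It assumes a compiler of that vintage; `typedef` was later removed from the language, so this will not build on modern D compilers.]

```d
// D1-era sketch: the casts the original post complains about.
typedef long mytype;

void main()
{
    mytype x = cast(mytype)6L; // a literal needs an explicit cast
    mytype y = cast(mytype)2L;

    x *= 6;                    // fine: op-assign with a literal works
    // x = y * 6;              // error: long does not implicitly convert to mytype
    x = y * cast(mytype)6;     // the cumbersome workaround
}
```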
December 03, 2007 Re: typedefs are useless

Posted in reply to Steven Schveighoffer

Steven Schveighoffer wrote:
> Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long.
<snip>

You can typedef things other than integral types.

--
Kirk McDonald
http://kirkmcdonald.blogspot.com
Pyd: Connecting D and Python
http://pyd.dsource.org
December 03, 2007 Re: typedefs are useless

Posted in reply to Kirk McDonald

"Kirk McDonald" wrote:
>
> You can typedef things other than integral types.
OK, so that is one difference between enum and typedef (I forgot about that, because I was so narrowly focused on my problem at hand).
but the same thing can be said for non-integral types:
typedef float mytype;
mytype x = 1.0; // does not work
This typedef is just as useless as an integral typedef. The fact that enum does not exist for a floating-point type does not justify the existence of typedef. You can use alias to make a more useful type, but then you do not have the restrictions I am looking for.
I look at a typedef as a useful way to create a derived type from a builtin type such that it is implicitly convertible to the base type, but not in reverse (similar to deriving from a base class). However, without the ability to specify literals, or even extend the syntax to be able to specify them (i.e. some way to make 1mt mean cast(mytype)1), the type is only useful as an enumeration, as any mathematical manipulation will require lots of casting statements, just like an enum.
-Steve
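[Editor's note: for comparison, a hedged sketch of how a struct-based "strong typedef" could get most of what is asked for here in later D. The struct name and the operator set are invented for illustration; arithmetic with bare literals works through operator overloads, while implicit conversion from a long variable is still rejected.]

```d
// Sketch only: a struct strong-typedef with literal-friendly operators.
struct MyType
{
    long value;

    MyType opBinary(string op)(long rhs) if (op == "*")
    {
        return MyType(mixin("value " ~ op ~ " rhs"));
    }

    void opOpAssign(string op)(long rhs) if (op == "*")
    {
        mixin("value " ~ op ~ "= rhs;");
    }
}

void main()
{
    auto x = MyType(6);  // construction is still explicit...
    auto y = MyType(7);
    x = y * 6;           // ...but arithmetic with literals needs no cast
    x *= 6;
    // long n = 5; MyType z = n; // still a compile error, as desired
}
```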
December 03, 2007 Re: typedefs are useless

Posted in reply to Steven Schveighoffer

Steven Schveighoffer wrote:
> I can do:
>
> typedef long mytype;
>
> However, I can't create literals of this type. So if I want to initialize a mytype value to 6, I have to do:
>
> mytype x = cast(mytype)6L;
<snip>

Complained about in Issue 1335 and earlier. This is why I use aliases instead of typedefs in all but the shortest of programs.

--
E-mail address: matti.niemenmaa+news, domain is iki (DOT) fi
December 03, 2007 Re: typedefs are useless

Posted in reply to Steven Schveighoffer

Steven Schveighoffer wrote:
> This may have arisen somewhere before, but...
>
> Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long.
[...]
this is getting really close to something I have wanted for some time:
typedef real MyReal
{
// stuff
}
it would be like a struct that inherits from a primitive type. The 'this' would be of the base type and you wouldn't be allowed to add any members. However, this would allow you to do things like overload the operators. The one thing in particular I would like to do would be to overload just the typing of the operators. This would result in the implementation of + (for instance) being the same as + on the underlying type, but the type of the result would be redefined. The point of this would be to allow a program to restrict the type that could be used. A concrete use case would be an SIUnits type that would, at compile time, verify unit correctness, but in the end would result in exactly the same code as if bare FP values were used.
Thoughts?
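[Editor's note: the proposed syntax above does not exist, but the SI-units use case can be approximated with plain templates in later D. This is a sketch under that assumption, with invented names: the operator computes the result *type* at compile time while the generated code is bare floating-point arithmetic.]

```d
// Sketch: a unit-tagged wrapper; dimension bookkeeping is purely compile time.
struct Quantity(int M, int S)  // meters^M * seconds^S
{
    real value;

    Quantity!(M + M2, S + S2) opBinary(string op, int M2, int S2)(Quantity!(M2, S2) rhs)
        if (op == "*")
    {
        return typeof(return)(value * rhs.value);
    }
}

alias Meters  = Quantity!(1, 0);
alias Seconds = Quantity!(0, 1);

void main()
{
    auto d = Meters(10.0);
    auto t = Seconds(2.0);
    auto a = d * t;  // result type is Quantity!(1, 1)
    static assert(is(typeof(a) == Quantity!(1, 1)));
    // Meters and Seconds are distinct types, so assigning one to the
    // other is a compile error: unit correctness checked for free.
}
```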
December 03, 2007 Re: typedefs are useless

Posted in reply to BCS

BCS wrote:
> Steven Schveighoffer wrote:
>> This may have arisen somewhere before, but...
>>
>> Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long.
> [...]
>
> this is getting really close to something I have wanted for some time:
>
> typedef real MyReal
> {
> // stuff
> }
>
> <snip>
>
> Thoughts?
Why not just make that behavior triggered when you do
struct MyReal : real
{
...
}
?
December 04, 2007 Re: typedefs are useless

Posted in reply to Steven Schveighoffer

Steven Schveighoffer wrote:
> Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long.
>
> I can do:
>
> typedef long mytype;
>
> However, I can't create literals of this type. so if I want to initialize a mytype value to 6, I have to do:
>
> mytype x = cast(mytype)6L;
FWIW, Ada solves this problem by considering literals in a special type called "universal integer." It's special because you can't actually declare any variables of that type. However, universal integers can be implicitly converted to other types derived from Integer. So, in Ada it looks like this:
type My_Type is range 0 .. 10; -- Or whatever range you need.
X : My_Type;
Y : Integer;
...
X := Y; -- Type mismatch. Compile error.
X := 1; -- Fine. Universal integer converts to My_Type.
This sounds like what you want for D. Note, by the way, that the range constraint on a type definition in Ada must be static. Thus the compiler can always tell if the value of the universal integer (which can only be a literal) is in the right range.
Ada also has a concept of universal float to deal with floating-point literals in a similar way.
Peter
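[Editor's note: one hedged way to approximate in D the "literals convert, variables don't" behavior described above is to pass the initializer as a template value parameter, so that only compile-time constants (such as literals) are accepted. The names here are invented for illustration; this is not an established idiom from the thread.]

```d
// Sketch: only compile-time values can become a MyType this way.
struct MyType
{
    long value;
}

MyType lit(long v)()  // v must be known at compile time, like a literal
{
    return MyType(v);
}

void main()
{
    auto x = lit!6;    // fine: 6 is a compile-time value
    long y = 6;
    // auto z = lit!y; // error: y is not a compile-time constant
}
```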
December 04, 2007 Re: typedefs are useless

Posted in reply to Peter C. Chapin

Peter C. Chapin wrote:
> FWIW, Ada solves this problem by considering literals in a special type called "universal integer."
<snip>

Behavior like this would come about as a result of polysemous values being added to D, yes?

--
~John Demme
me@teqdruid.com
December 04, 2007 Re: typedefs are useless

Posted in reply to Peter C. Chapin

IMO, in this case Ada is slicker than D.
I remember that, for instance,
type Natural is range 0..255
is quite often used. (It's a long time ago that I learned a bit of Ada.)
With quality assurance and reliable software in mind, the Ada way is much smarter than using D's assert() or DbC instead.
Hope we'll see this feature in D; maybe it is worth a feature request.
Bjoern
Peter C. Chapin wrote:
> Steven Schveighoffer wrote:
>> Let's say I want a way to create a type that's like a long, but is not implicitly convertible from a long.
> <snip>
> FWIW, Ada solves this problem by considering literals in a special type called "universal integer."
> <snip>
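[Editor's note: the Ada range idea BLS refers to can be emulated in D with an assertion in a constructor, albeit as a run-time rather than compile-time check. This is a rough sketch with invented names, not an established library type.]

```d
// Sketch: a range-constrained integer type, checked at construction.
struct Ranged(long lo, long hi)
{
    long value;

    this(long v)
    {
        assert(v >= lo && v <= hi, "value out of range");
        value = v;
    }
}

alias Natural = Ranged!(0, 255);

void main()
{
    auto n = Natural(200);      // passes the check
    // auto bad = Natural(300); // would fail the assert at run time
}
```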
December 04, 2007 Re: typedefs are useless

Posted in reply to BLS

BLS wrote:
> IMO, in this case Ada is slicker than D.
> I remember that, for instance,
> type Natural is range 0..255
> is quite often used.
Actually that's
subtype Natural is Integer range 0 .. Integer'Last;
The values in the type are all the non-negative integers. Also it's a subtype so it can be freely mixed with its parent type (Integer). However, a run time check will be added when necessary to verify the range constraint.
Sorry about the off-topic post. I just felt the need to clarify that point. :-)
Peter
Copyright © 1999-2021 by the D Language Foundation