November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Michael V. Franklin

On Tuesday, 14 November 2017 at 13:43:32 UTC, Michael V. Franklin wrote:
> Nick, if it's not in bugzilla already, can you please add it?

Sure: https://issues.dlang.org/show_bug.cgi?id=17983
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Steven Schveighoffer

On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
Is everyone in general agreement on this? Can anyone think of a compelling use case?
Mike
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Michael V. Franklin

On Tue, Nov 14, 2017 at 11:05:51PM +0000, Michael V. Franklin via Digitalmars-d wrote:
> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
>
>> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
>
> Is everyone in general agreement on this? Can anyone think of a compelling use case?
[...]

I am 100% for this change. I've been bitten before by things like this:

    void myfunc(char ch) { ... }
    void myfunc(int i) { ... }

    char c;
    int i;

    myfunc(c);   // calls first overload
    myfunc('a'); // calls second overload (WAT)
    myfunc(i);   // calls second overload
    myfunc(1);   // calls second overload

There is no compelling use case for implicitly converting char types to int. If you want to directly manipulate ASCII values / Unicode code point values, a direct cast is warranted (clearer code intent).

Converting char to wchar (or dchar, or vice versa, etc.) implicitly is also fraught with peril: if the char happens to be an upper byte of a multibyte sequence, you *implicitly* get a garbage value. Not useful at all. Needing to write an explicit cast will remind you to think twice, which is a good thing.

T

--
Famous last words: I wonder what will happen if I do *this*...
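The multibyte-sequence hazard described above can be reproduced with a short sketch (the string literal and variable names are illustrative, not code from the thread):

```d
import std.stdio;

void main()
{
    string s = "é";  // encoded in UTF-8 as two code units: 0xC3 0xA9
    char c = s[0];   // 0xC3, the lead byte of the two-byte sequence
    dchar d = c;     // implicit char -> dchar conversion compiles silently
    writeln(d);      // prints 'Ã' (U+00C3), a garbage code point, not 'é'
}
```

Nothing in the program mentions a conversion, yet a non-character value silently becomes a "valid" dchar, which is exactly the trap an explicit cast would flag.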
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Nick Treleaven

On Tuesday, 14 November 2017 at 13:20:22 UTC, Nick Treleaven wrote:
> A very similar problem exists for int and char overloads:
>
> alias foo = (char c) => 1;
> alias foo = (int i) => 4;
>
> enum int e = 7;
> static assert(foo(e) == 4); // fails
Wait a minute! This doesn't appear to be a casting or overload problem. Can you really overload aliases in D?
I would expect the compiler to throw an error as `foo` is being redefined. Or for `foo` to be replaced by the most recent assignment in lexical order. Am I missing something?
Mike
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Michael V. Franklin

On 11/14/17 6:14 PM, Michael V. Franklin wrote:
> On Tuesday, 14 November 2017 at 13:20:22 UTC, Nick Treleaven wrote:
>
>> A very similar problem exists for int and char overloads:
>>
>>     alias foo = (char c) => 1;
>>     alias foo = (int i) => 4;
>>
>>     enum int e = 7;
>>     static assert(foo(e) == 4); // fails
>
> Wait a minute! This doesn't appear to be a casting or overload problem. Can you really overload aliases in D?

In fact, I'm surprised you can alias to an expression like that. Usually you need a symbol. It's probably due to how this is lowered.

Indeed, this is a completely different problem:

    enum int e = 500;
    static assert(foo(e) == 4); // fails to compile (can't call char with 500)

If you define foo as an actual overloaded function set, it works as expected.

> I would expect the compiler to throw an error as `foo` is being redefined. Or for `foo` to be replaced by the most recent assignment in lexical order. Am I missing something?

In this case, the compiler simply *ignores* the newest definition. It should throw an error IMO.

-Steve
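Steven's remark that a real overload set behaves correctly can be checked with a sketch along these lines (a hypothetical example, not code from the thread):

```d
// With true function overloads, an int argument matches the int
// overload exactly, so no int -> char conversion is attempted.
int foo(char c) { return 1; }
int foo(int i)  { return 4; }

enum int e = 7;
static assert(foo(e) == 4); // passes: the int overload is selected

void main() {}
```

Compare this with the alias version quoted above, where the second `alias foo` is silently ignored and only the `char` lambda remains visible.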
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Michael V. Franklin

On 11/14/2017 06:05 PM, Michael V. Franklin wrote:
> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
>
>> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
>
> Is everyone in general agreement on this? Can anyone think of a compelling use case?
No, that would be too large a change of the rules. FWIW 'a' has type dchar, not char.

-- Andrei
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to H. S. Teoh

On 11/14/17 6:09 PM, H. S. Teoh wrote:
> On Tue, Nov 14, 2017 at 11:05:51PM +0000, Michael V. Franklin via Digitalmars-d wrote:
>> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
>>
>>> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
>>
>> Is everyone in general agreement on this? Can anyone think of a compelling use case?
> [...]
>
> I am 100% for this change. I've been bitten before by things like this:
>
>     void myfunc(char ch) { ... }
>     void myfunc(int i) { ... }
>
>     char c;
>     int i;
>
>     myfunc(c);   // calls first overload
>     myfunc('a'); // calls second overload (WAT)
>     myfunc(i);   // calls second overload
>     myfunc(1);   // calls second overload

I couldn't believe that this is the case, so I tested it:

https://run.dlang.io/is/AHQYtA

For those who don't want to look: it does indeed call the first overload for a character literal, so this is not a problem (maybe you were thinking of something else?).

> There is no compelling use case for implicitly converting char types to int. If you want to directly manipulate ASCII values / Unicode code point values, a direct cast is warranted (clearer code intent).

I think you misunderstand the problem. It's fine for chars to promote to int, or even bools for that matter. It's the other way around that is problematic. To put it another way, if you make this require a cast, you will have some angry coders :)

    if (c >= '0' && c <= '9')
        value = c - '0';

> Converting char to wchar (or dchar, or vice versa, etc.) implicitly is also fraught with peril: if the char happens to be an upper byte of a multibyte sequence, you *implicitly* get a garbage value. Not useful at all. Needing to write an explicit cast will remind you to think twice, which is a good thing.

Agree, these should require casts, since the resulting type is probably not what you want in all cases. Where this continually comes up is char ranges. Other than actual char[] arrays, the following code doesn't do the right thing at all:

    foreach (dchar d; charRange)

If we made it require a cast, this would find such problems easily.

-Steve
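The promotion direction Steven defends (char widening to int in arithmetic) can be made concrete with a runnable version of his snippet (assuming an ASCII digit as input):

```d
void main()
{
    char c = '7';
    int value;
    // char promotes to int in arithmetic expressions, so the classic
    // digit-to-value idiom needs no cast:
    if (c >= '0' && c <= '9')
        value = c - '0';
    assert(value == 7);
}
```

This is the direction that is safe (no information is lost going from char to int) and that the thread agrees should remain implicit; only the narrowing direction would require a cast.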
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Steven Schveighoffer

On Tue, Nov 14, 2017 at 06:53:43PM -0500, Steven Schveighoffer via Digitalmars-d wrote:
> On 11/14/17 6:09 PM, H. S. Teoh wrote:
>> I am 100% for this change. I've been bitten before by things like this:
>>
>>     void myfunc(char ch) { ... }
>>     void myfunc(int i) { ... }
>>
>>     char c;
>>     int i;
>>
>>     myfunc(c);   // calls first overload
>>     myfunc('a'); // calls second overload (WAT)
>>     myfunc(i);   // calls second overload
>>     myfunc(1);   // calls second overload
>
> I couldn't believe that this is the case so I tested it:
>
> https://run.dlang.io/is/AHQYtA
>
> for those who don't want to look, it does indeed call the first overload for a character literal, so this is not a problem (maybe you were thinking of something else?)
[...]

Argh, should've checked before I posted. What I meant was more something like this:

    import std.stdio;

    void f(dchar) { writeln("dchar overload"); }
    void f(ubyte) { writeln("ubyte overload"); }

    void main()
    {
        f(1);
        f('a');
    }

Output:

    ubyte overload
    ubyte overload

It "makes sense" from the POV of C/C++-compatible integer promotion rules, but in the context of D, it's just very WAT-worthy.

T

--
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. -- Brian W. Kernighan
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Andrei Alexandrescu

On 11/14/17 6:48 PM, Andrei Alexandrescu wrote:
> On 11/14/2017 06:05 PM, Michael V. Franklin wrote:
>> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
>>
>>> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
>>
>> Is everyone in general agreement on this? Can anyone think of a compelling use case?
>
> No, that would be too large a change of the rules.

All it means is that when VRP allows it, you still have to cast. It's not that large a change actually, but I can see how it might be too disruptive to be worth it.

> FWIW 'a' has type dchar, not char. -- Andrei

No:

    pragma(msg, typeof('a')); // char

-Steve
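As I understand the VRP (value range propagation) point, the current rules already let a literal that provably fits initialize a char, while a runtime int needs an explicit cast; a sketch with illustrative values:

```d
void main()
{
    char a = 65;            // OK today: VRP proves 65 fits in a char ('A')
    int i = 300;
    // char b = i;          // error: cannot implicitly convert int to char
    char b = cast(char) i;  // explicit cast required; truncates mod 256
    assert(a == 'A');
    assert(b == 44);        // 300 % 256
}
```

The proposal under discussion would make even the VRP-approved first line require a cast, which is the "not that large" change Steven weighs against the disruption it would cause.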
November 14, 2017 Re: Deprecate implicit `int` to `bool` conversion for integer literals

Posted in reply to Michael V. Franklin

On 11/14/17 6:05 PM, Michael V. Franklin wrote:
> On Tuesday, 14 November 2017 at 13:54:03 UTC, Steven Schveighoffer wrote:
>
>> IMO, no character types should implicitly convert from integer types. In fact, character types shouldn't convert from ANYTHING (even other character types). We have so many problems with this.
>
> Is everyone in general agreement on this? Can anyone think of a compelling use case?
I would think this is a matter for another DIP than the one you are looking at, as it is more far-reaching than just overload problems. They are real problems, but this makes the DIP's scope broader than it should be, and lessens the chance of acceptance.
-Steve
Copyright © 1999-2021 by the D Language Foundation