Converting char to int

Thread overview:
  Caeadas (Jan 01, 2014)
  thedeemon (Jan 01, 2014)
  Meta (Jan 01, 2014)
  Caeadas (Jan 01, 2014)
  monarch_dodra (Jan 02, 2014)
  Marco Leise (Jan 01, 2014)
  Meta (Jan 01, 2014)
January 01, 2014 (Caeadas)
I hope you'll forgive my asking an overly easy question: I've
been searching for an answer and having trouble finding one, and
am getting frustrated.

My issue is this: I'm trying to convert character types to
integers using to!int from std.conv, and I'm getting, for
example, '0'->48, '1'->49, etc. It seems like there should be a
simple way around this, but it's eluding me.
January 01, 2014 (thedeemon)
On Wednesday, 1 January 2014 at 06:21:05 UTC, Caeadas wrote:
> My issue is this: I'm trying to convert character types to
> integers using to!int from std.conv, and I'm getting, for
> example, '0'->48, '1'->49, etc. It seems like there should be a
> simple way around this, but it's eluding me.

Well, in ASCII (as well as UTF-8 and many other encodings) the character '0' is represented by the number 48, '1' by 49, and so on. So if you want to convert '5' to 5, you just need to subtract the value of '0' from it, e.g.
char c = ...;
int x = c - '0';
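
A slightly fuller, runnable sketch of the same idea (my own example, not from the original post):

void main()
{
    char c = '7';      // '7' has code 55 in ASCII/UTF-8
    int x = c - '0';   // 55 - 48
    assert(x == 7);
}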
January 01, 2014 (Meta)
On Wednesday, 1 January 2014 at 06:21:05 UTC, Caeadas wrote:
> I hope you'll forgive my asking an overly easy question: I've
> been searching for an answer and having trouble finding one, and
> am getting frustrated.
>
> My issue is this: I'm trying to convert character types to
> integers using to!int from std.conv, and I'm getting, for
> example, '0'->48, '1'->49, etc. It seems like there should be a
> simple way around this, but it's eluding me.

Your code is working correctly. D's chars, for all values up to 255, are the same as the ASCII character set. If you look in the ASCII table here: http://www.asciitable.com/ you will see that '0' corresponds to 48, and '1' corresponds to 49. The easiest solution that will fix your code is to change to!int('0') and to!int('1') to to!int("0") and to!int("1"). It seems that for characters the to! function just returns their ASCII value, but it will definitely do what you want for strings. Strangely enough, there doesn't seem to be a function in Phobos to convert a character to its numeric value, but it's simple to implement.

int charToInt(char c)
{
    return (c >= 48 && c <= 57) ? cast(int)(c - 48) : -1;
}

void main()
{
    assert(charToInt('0') == 0);
    assert(charToInt('1') == 1);
    assert(charToInt('9') == 9);
    assert(charToInt('/') == -1);
    assert(charToInt(':') == -1);
}
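
A quick sketch of the string-based route mentioned above (my own example, not from the original post):

import std.conv : to;

void main()
{
    // to!int parses a string as a number instead of returning a code point
    assert(to!int("0") == 0);
    assert(to!int("9") == 9);
    assert(to!int("42") == 42);
}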
January 01, 2014 (Caeadas)
Thanks much for the help, both of you :). I thought there might
be a very simple way to do this, since it's so intuitive to
change '4' to 4. I've basically just been subtracting 48 from
everything, but I suppose aesthetically it's a bit nicer to
convert from string instead.
January 01, 2014 (Marco Leise)
On Wed, 01 Jan 2014 07:27:54 +0000,
"Meta" <jared771@gmail.com> wrote:

> Your code is working correctly. D's chars, for all values up to 255, are the same as the ASCII character set.

UTF-8 reuses the ASCII mapping which is only defined from 0 to 127. Everything above is not ASCII and 255 is in fact not even defined for UTF-8, which is why it was chosen as the initializer for char in D.
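
A one-line check of that last point (my own snippet, not part of the original message):

void main()
{
    // char.init in D is 0xFF (255), which is not a valid UTF-8 code unit
    assert(char.init == 0xFF);
}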

-- 
Marco

January 01, 2014 (Meta)
On Wednesday, 1 January 2014 at 20:08:24 UTC, Marco Leise wrote:
> On Wed, 01 Jan 2014 07:27:54 +0000,
> "Meta" <jared771@gmail.com> wrote:
>
>> Your code is working correctly. D's chars, for all values up to 255, are the same as the ASCII character set.
>
> UTF-8 reuses the ASCII mapping which is only defined from 0 to
> 127. Everything above is not ASCII and 255 is in fact not even
> defined for UTF-8, which is why it was chosen as the
> initializer for char in D.

Yes, I didn't want to cause confusion talking about ASCII vs. Unicode.
January 02, 2014 (monarch_dodra)
On Wednesday, 1 January 2014 at 20:01:56 UTC, Caeadas wrote:
> Thanks much for the help, both of you :). I thought there might
> be a very simple way to do this, since it's so intuitive to
> change '4' to 4.

There has been talk about it, but it hasn't been implemented yet. The idea was to provide the function for ASCII *once* we've implemented it in Unicode.

> I've basically just been subtracting 48 from
> everything, but I suppose aesthetically it's a bit nicer to
> convert from string instead.

Don't subtract 48. That's just a number! Subtract '0' :)

int charToInt(char c)
{
    // char is unsigned, so anything below '0' wraps around to a large value here
    c -= '0';
    // only '0'..'9' end up in the range 0..9 after the subtraction
    return c <= 9 ? c : -1;
}
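
A quick sanity check (asserts of my own, mirroring the earlier ones) that this version behaves like the one above:

void main()
{
    assert(charToInt('0') == 0);
    assert(charToInt('9') == 9);
    assert(charToInt('/') == -1); // below '0': the unsigned subtraction wraps around to a large value
    assert(charToInt(':') == -1);
}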