Thread overview
Converting int to dchar?

Darren, Jul 31, 2016
Seb, Jul 31, 2016
Johannes Loher, Jul 31, 2016
ag0aep6g, Jul 31, 2016
Darren, Jul 31, 2016
July 31, 2016 (Darren)
Hey, all.

I'm pretty much a programming novice, so I hope you can bear with me.  Does anyone know how I can change an int into a char equivalent?

e.g.

    int i = 5;
    dchar value;
    ?????
    assert(value == '5');

If I try to cast it to dchar, I get messed-up output, and I'm not sure how to use toChars (if that can accomplish this).

I can copy+paste the little exercise I'm working on if that helps?

Thanks in advance!
July 31, 2016 (Seb)
On Sunday, 31 July 2016 at 21:31:52 UTC, Darren wrote:
> Does anyone know how I can change an int into a char equivalent?
> [...]
> If I try to cast it to dchar, I get messed-up output, and I'm not sure how to use toChars (if that can accomplish this).

Ehm, how do you want to represent 1_000 in one dchar? You need to format it, like this:

    import std.format : format;
    assert("%d".format(1_000) == "1000");

Note that you get an array of dchars (=string), not a single one.
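
For reference, the same snippet as a complete program; the assert just needs a main around it:

    import std.format : format;

    void main()
    {
        // format renders the whole number as decimal text, digit by digit
        string s = "%d".format(1_000);
        assert(s == "1000");
    }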
July 31, 2016 (ag0aep6g)
On 07/31/2016 11:31 PM, Darren wrote:
> If I try to cast it to dchar, I get messed-up output,

Because it gives you a dchar with the numeric value 5, which is a control character (U+0005), not the digit '5'.

> and I'm not sure
> how to use toChars (if that can accomplish this).

    value = i.toChars.front;

toChars (from std.conv) converts the number to a range of chars. front takes the first of them.

Similarly, you could also convert to a (d)string and take the first character:

    value = i.to!dstring[0];

Or if you want to appear clever, add i to '0':

    value = '0' + i;

I'd generally prefer toChars.front here. to!dstring[0] makes an allocation you don't need, and '0' + i is more obscure and bug-prone.
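
Putting the three together, a minimal sketch that should compile as-is (toChars and to live in std.conv):

    import std.conv : to, toChars;
    import std.range : front;

    void main()
    {
        int i = 5;
        dchar value;

        // range-based: no allocation, front is the first digit character
        value = i.toChars.front;
        assert(value == '5');

        // string-based: allocates a dstring, then indexes it
        value = i.to!dstring[0];
        assert(value == '5');

        // arithmetic: relies on '0'..'9' being consecutive code points
        value = '0' + i;
        assert(value == '5');
    }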
July 31, 2016 (Johannes Loher)
On 31.07.2016 at 23:46, Seb wrote:
> On Sunday, 31 July 2016 at 21:31:52 UTC, Darren wrote:
>> [...]
> 
> Ehm, how do you want to represent 1_000 in one dchar? You need to format it, like this:
> 
>     import std.format : format;
>     assert("%d".format(1_000) == "1000");
> 
> Note that you get an array of dchars (=string), not a single one.

An immutable array of dchars is a dstring, not a string (which is an immutable array of chars). It is true, however, that you should not convert to dchar but to string (or dstring, if you want UTF-32, but I see no real reason for that if you are only dealing with numbers), because of the reason mentioned above. Another solution for this would be using "to":

    import std.conv : to;

    void main()
    {
        int i = 5;
        string value = i.to!string; // value is "5"
        assert(value == "5");
    }

If you know that your int only has one digit and you really want to get it as a char, you can always use value[0].
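
For example, continuing with the same names as above:

    import std.conv : to;

    void main()
    {
        int i = 5;
        string value = i.to!string;
        char c = value[0]; // safe here because i has exactly one digit
        assert(c == '5');
    }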

July 31, 2016 (Darren)
That's a really informative response.  Thank you!