Thread overview: convert char[4] to uint at compile time

Dec 16, 2008  Moritz Warning
Dec 16, 2008  BCS
Dec 16, 2008  Moritz Warning
Dec 23, 2008  Janderson
Dec 23, 2008  Denis Koroskin
Dec 23, 2008  Moritz Warning
Dec 23, 2008  Moritz Warning
December 16, 2008
Hi,

I have problems converting a char[4] to a uint at compile time. All the variations I've tried with an enum crash dmd:

union pp { char[4] str; uint num; }
const uint x = pp("abcd").num;

This also doesn't work:

const uint x = cast(uint) x"aa aa aa aa";


Any ideas?
December 16, 2008
Reply to Moritz,

> Hi,
> 
> I have problems converting a char[4] to a uint at compile time. All
> the variations I've tried with an enum crash dmd:
> 
> union pp { char[4] str; uint num; }
> const uint x = pp("abcd").num;
> 
> This also doesn't work:
> 
> const uint x = cast(uint) x"aa aa aa aa";
> 
> Any ideas?
> 


// Packs the four chars big-endian: first char in the most significant byte.
template Go (char[4] arg)
{
   const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) | arg[3];
}

import std.stdio;
void main()
{
  writef("%x\n", Go!("Good"));
}
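
For what it's worth, the same packing can also be written as an ordinary function and evaluated at compile time via CTFE (a sketch, assuming a dmd of roughly this vintage with CTFE support, which was added in dmd 1.006; fourCC is just a name made up here):

import std.stdio;

// Same big-endian packing as the template above, usable in CTFE.
uint fourCC(char[] s)
{
    // Caller must pass exactly four chars.
    return (s[0] << 24) | (s[1] << 16) | (s[2] << 8) | s[3];
}

const uint good = fourCC("Good"); // initializer forces compile-time evaluation

void main()
{
    writef("%x\n", good);
}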


December 16, 2008
On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:

> Reply to Moritz,
> 
> [...]
> 
> template Go (char[4] arg)
> {
>     const uint Go = (arg[0] << 24) | (arg[1] << 16) | (arg[2] << 8) |
>     arg[3];
> }
> 
> import std.stdio;
> void main()
> {
>    writef("%x\n", Go!("Good"));
> }

Thanks!
That workaround should do it.

Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
December 23, 2008
Moritz Warning wrote:
> On Tue, 16 Dec 2008 19:54:11 +0000, BCS wrote:
> 
> [...]
> 
> Maybe it will be possible to just do cast(uint) "abcd" in the future. :>

That would only cast the pointer. It should be something like cast(uint)(*"abcd") or *cast(uint*) "abcd".

-Joel
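
For illustration, here is what the pointer version does at run time (a sketch; it reinterprets the literal's bytes in host byte order, so it is endian-dependent and not usable as a portable compile-time constant):

import std.stdio;

void main()
{
    // Read the literal's four bytes as one uint, in host byte order.
    // Assumes the literal is suitably aligned for a uint load.
    uint n = *cast(uint*) "abcd".ptr;
    writef("%x\n", n); // 64636261 on little-endian, 61626364 on big-endian
}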
December 23, 2008
On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask@me.com> wrote:

> Moritz Warning wrote:
> 
> [...]
> 
>>  Maybe it will be possible to just do cast(uint) "abcd" in the future. :>
>
> That would only cast the pointer. It should be something like cast(uint)(*"abcd") or *cast(uint*) "abcd".
>
> -Joel

And what about endianness? You can't have a feature in a language that gives different results in different environments.
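
For contrast, the shift-based packing in BCS's template depends only on the char values, not on memory layout, so it yields the same uint on every host. A minimal compile-time check (a sketch, reusing the Go template from above):

static assert(Go!("abcd") == 0x61626364); // holds on any architecture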
December 23, 2008
On Tue, 23 Dec 2008 00:07:08 -0800, Janderson wrote:

> Moritz Warning wrote:
> 
> [...]
> 
>> Maybe it will be possible to just do cast(uint) "abcd" in the future.
>> :>
> 
> That would only cast the pointer. It should be something like
> cast(uint)(*"abcd") or *cast(uint*) "abcd".
> 
> -Joel

I'd like to see "abcd" be a value type, like a decimal or hex literal. A cast(uint) would be possible and nice in that case.
December 23, 2008
On Tue, 23 Dec 2008 13:16:28 +0300, Denis Koroskin wrote:

> On Tue, 23 Dec 2008 11:07:08 +0300, Janderson <ask@me.com> wrote:
> 
>> [...]
> 
> And what about endianness? You can't have a feature in a language that gives different results in different environments.

The use of uint in my example might be confusing.
I only needed an environment-independent bit pattern of 4 bytes.
An integer is used because comparing integers is faster than
comparing char[4]s with DMD. :/

(GDC doesn't show this behavior.)
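
To illustrate the pattern (a sketch with made-up names; isForm, isFormStr, and FORM_TAG are hypothetical, and Go is BCS's template from above): a pre-packed uint tag turns the four-byte check into a single integer comparison, while the char[4] version compares the array contents:

// Tag packed at compile time with the Go template.
const uint FORM_TAG = Go!("FORM");

// Single integer comparison.
bool isForm(uint tag)
{
    return tag == FORM_TAG;
}

// Array content comparison; this is the variant that dmd
// compiles to slower code.
bool isFormStr(char[] s)
{
    return s == "FORM";
}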